6th & 7th week - 08 / 15 November
During these weeks we approached technology from a totally different point of view. We had two different workshops where we were divided into groups, keeping a collaborative environment: from disassembling a machine to understanding the various data-gathering methods. Now we can understand technology not just as mere users observing black boxes from the outside, but as users or even makers who comprehend all the work and steps taken to produce any of our daily-life devices, acquiring enough knowledge to ask the right questions and find possible partnerships.
In order to pursue this approach, we were asked to create a forensic report on several devices that, for one reason or another, are no longer useful. There was a Roomba, a TV, an induction stove, an iMac and a MacBook. By disassembling all these different machines we learned that some could have been broken by a simple failure of one part, or that the machine had become obsolete due to new software.
We should start to ask ourselves some questions about all our technological goods. What is obsolescence? How much e-waste are we going to keep producing? Why don't companies offer users the right to repair? How long is the supply chain of any device? How much carbon is emitted for a "simple" phone? At what cost were these devices produced? How much of the manufacturing process was actually human labor and how much was automation? How much material and how many natural resources were extracted to build a device?
My group was formed by Anna, George, Gerda, Nikita, Marina, Tatiana and me. We ripped apart a 2003 Apple PowerBook. Find the link to the HackMD journal below:
Data-gathering methods and the importance of data: understanding how data can give us power. By having open sources of data we can place the power of decision-making in the hands of individuals. We were divided into groups to take a more accessible approach to data. First it is important to set an objective, followed by asking questions and finally setting a hypothesis. There were different data-gathering tools, such as the Smart Citizen Kit, a Pi camera, web scraping, physical intervention, GPS mobile location and an Arduino LDR sensor. My group used web scraping to research our hypothesis: the food in the vending machine at IAAC is not local.
MDEF: Measuring the world / A world in data activity report.
A report by: Angel Cho, Chris Ernst, Julia Steketee, Paula Del Rio, Tattiana Butts and Vikrant Mishra.
Objective:
We want to eat more locally produced food.
Question:
Where does our food come from?
Hypothesis:
The majority of the food in the vending machine is not locally produced.
Explain one or more mistakes you made during that phase.
What would you change if you did it again?
Our expectations were too high: we assumed that a lot of the data regarding food production would be available to the public.
Maybe we could re-orient our objective from location to nutrition.
Post multiple images of the tool. What tool did you use? Would you choose a different tool now?
Web scraping: manual and automated with Python
Finding websites that have databases about food production, imports and exports:
OEC
ITC Trade Map
How can others replicate your data-capturing process?
They can find the base code of our web-scraping tool on the FabLab HackMD (here); a simplified sketch is also included below.
All database sources are written below.
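For readers who only want the general idea, here is a minimal, hypothetical sketch of the automated approach (Python with requests and BeautifulSoup). The URL, the table layout and the column names are placeholders, not the ones from our actual tool, which remains the version linked above.

```python
# Minimal web-scraping sketch (illustrative only; the real code is on the FabLab HackMD).
# The URL and the CSS selectors below are placeholders, not the ones we actually used.
import csv

import requests
from bs4 import BeautifulSoup

URL = "https://example.org/food-trade-data"  # placeholder page with a data table

response = requests.get(URL, timeout=30)
response.raise_for_status()
soup = BeautifulSoup(response.text, "html.parser")

rows = []
# Assume the page lists products in an HTML table: product | origin | value.
for tr in soup.select("table tr")[1:]:  # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) >= 3:
        rows.append({"product": cells[0], "origin": cells[1], "value": cells[2]})

# Store the scraped rows as CSV, the same format we submitted.
with open("scraped_data.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["product", "origin", "value"])
    writer.writeheader()
    writer.writerows(rows)

print(f"Saved {len(rows)} rows")
```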
How do you combine the tool provided with your creativity to prove your hypothesis? How long did you capture data?
We decided which categories to research based on the ingredients in IAAC’s specific vending machine. We started small, then built up until we reached the global scale of interconnected supply chains.
List all the materials needed, including those given to you, those you sourced or even things you built yourself.
Techniques used:
Resources used:
Explain the setup process. You can simply publish multiple images about your setup.
Map of our process:
Describe the raw data you collected by posting a sample i.e. a picture, a screen capture, etc.
Excel sheets generated from Open Food Facts:
Map from Open Food Facts
Excel sheet from ITC Trade Map
Interactive map from OEC
Interactive map from CIAT
Thanks to all of these sources, we managed to cross-reference the information we obtained. We noticed many differences from one resource to another.
Data Summary | |
---|---|
Project Title | Food Origins |
Capture Start | 11-11-2021 |
Capture End | 12-11-2021 |
Original Data Format | Website HTML |
Submitted Format | CSV files |
Total Data Points | Approximately 5,000 |
Number of Datasets | 5 separate files |
Data Repository | https://github.com/fablabbcn/mdef-a-world-in-data |
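As a rough illustration of how the separate exports could be cross-referenced, a pandas sketch along these lines would work; the file names and column names are hypothetical, and the real files live in the repository listed in the table above.

```python
# Hedged sketch: cross-referencing the CSV exports with pandas.
# File names and column names are hypothetical placeholders.
import pandas as pd

sources = {
    "openfoodfacts": "openfoodfacts_sandwiches.csv",
    "itc_trademap": "itc_trademap_spain_imports.csv",
    "oec": "oec_spain_imports.csv",
}

frames = []
for source, path in sources.items():
    df = pd.read_csv(path)
    df["source"] = source  # remember which resource each row came from
    frames.append(df[["product", "origin_country", "source"]])

combined = pd.concat(frames, ignore_index=True)

# Count how many resources agree on each product/origin pair;
# low counts are where the differences between resources show up.
agreement = (
    combined.groupby(["product", "origin_country"])["source"]
    .nunique()
    .sort_values(ascending=False)
)
print(agreement.head(20))
```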
Post at least two images of a chart or a screenshot of your data that you used to prove whether your hypothesis is false.
We were surprised to see that the Natwins cookies claimed their product was “local”. However, they do not define what exactly local means, and later state that their ingredients come from the “Mediterranean”.
The Mediterranean area includes 21 countries, which means that the food origins are almost untraceable (Albania, Algeria, Bosnia and Herzegovina, Croatia, Cyprus, Egypt, France, Greece, Israel, Italy, Lebanon, Libya, Malta, Monaco, Montenegro, Morocco, Slovenia, Spain, Syria, Tunisia, and Turkey).
We decided to buy a sandwich from the vending machine and trace the possible origins of the main ingredients, using OEC’s data concerning Spain’s imported products.
The unit of measurement was the value of the products in USD, not in tonnes.
The primary ingredients of the sandwich were:
And these were the primary imports in Spain:
Of course, this only displays the probability of where each component originated if they were imported.
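To make that probability reading concrete, here is a tiny hypothetical sketch of how one ingredient's likely origins could be estimated from import values in USD; the countries and figures are placeholders, not the real OEC numbers.

```python
# Hedged sketch: turning import values (in USD) into rough origin probabilities
# for a single sandwich ingredient. All figures below are placeholders.
wheat_imports_usd = {
    "Country A": 500_000_000,
    "Country B": 300_000_000,
    "Country C": 200_000_000,
}

total = sum(wheat_imports_usd.values())
for country, value in sorted(wheat_imports_usd.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {value / total:.0%} of imported value")
```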
Sometimes it might be beneficial to see if there is an open API to access a database instead of going for web scraping the frontend data right away. In the case of Openfoodfacts.com, they offered an open and very well-documented API, offering various export formats. This allowed us to easily download and analyze the complete dataset for the product category of ‘sandwiches’. This was made possible thanks to all the data being covered by the Open Data Commons License.
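As an illustration of that API route, a request along these lines could pull the 'sandwiches' category directly; the endpoint and field names reflect our reading of the public Open Food Facts API and should be checked against its current documentation.

```python
# Hedged sketch: fetching the 'sandwiches' category from the Open Food Facts API.
# Endpoint and field names are assumptions based on the public documentation.
import requests

url = "https://world.openfoodfacts.org/category/sandwiches.json"
data = requests.get(url, params={"page_size": 100}, timeout=30).json()

for product in data.get("products", []):
    name = product.get("product_name") or "unknown"
    origins = product.get("origins") or "not declared"
    places = product.get("manufacturing_places") or "not declared"
    print(f"{name}: origins={origins}, manufactured in={places}")
```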
It is very difficult to retrieve information about where food comes from and where it goes
There is a lack of transparency regarding the movement of goods
There is no detailed information available to the public about food sources
Recognising that web scraping is an option, but not always the best or most efficient one.
Explain one or more mistakes you made during that phase. What would you change if you did it again? What if you had more time? (max 560 char)
Defining a more specific target in our hypothesis would have allowed us to access more relevant information.
Possibly using a different context (restaurant, grocery store) would have yielded more interesting results.
Find the full group presentation here
Forensic Report: Apple PowerBook
Examination
Model number: A1046
Brand: Apple (designed in California)
Model: PowerBook G4
Color: silver
Assembled in: Taiwan
Rated: 25.4 V DC
Specifications: Canadian ICES-003 Class B
Tested to: FCC standards (home or office use)
Forensic Questions
What does it do?
An electronic device that can store large amounts of information and be given sets of instructions to organize and change it very quickly.
How does it work?
How is it built?
We assume that part of the construction was automated and that the finer details required human intervention.
Why did it fail, or why was it no longer used?
We believe that one of the two fans was damaged, possibly causing the CPU to overheat and melt.
We also suspect the laptop as a whole is no longer usable due to its outdated components.
Steps taken
Opening the laptop
Inside the laptop
Inside the display
Inside the keyboard
Results
How many motors did we find inside? Does it contain a computer or a microcontroller?
Did the appliance fail? If so, why?
Conclusions
Overall, it was a lot of fun to disassemble the laptop and examine all the parts to see how they work and which ones are connected.
Opinions
What did you learn?
What surprised you?
Example images