Crazy week for many reasons, so do not expect a lot of content. Anyway, let’s go ahead:
Covid-19: bad for your lungs in many different ways
I am not going to introduce the pandemic to anyone, but maybe some of you did not realize that, besides the direct health issues the virus brought, problems in other directions also came up. One of them is that commonly used face masks represent a big waste problem. Let’s say 30% of the population uses a single-use mask every day (I think the number is higher, but let’s keep it simple). A country like France would go through about 22 million masks every day. If a mask weighs about 3 grams, that means every day we generate about 66,000 kg of waste. Multiply that by a whole year and you get more than 24 million kilograms (roughly 24,000 tonnes) of waste. Now run the numbers for all the countries and… you get the idea.
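The back-of-the-envelope estimate can be reproduced in a few lines (the 22 million masks and 3 grams per mask are the same rough assumptions as above, not measured figures):

```python
# Rough estimate of single-use mask waste for a country like France.
masks_per_day = 22_000_000  # ~30% of the population, one mask per day
mask_weight_g = 3           # assumed weight of one mask, in grams

waste_kg_per_day = masks_per_day * mask_weight_g / 1000
waste_kg_per_year = waste_kg_per_day * 365

print(f"{waste_kg_per_day:,.0f} kg of waste per day")    # 66,000 kg
print(f"{waste_kg_per_year:,.0f} kg of waste per year")  # 24,090,000 kg
```

So the yearly figure is on the order of tens of thousands of tonnes per country, which is already plenty.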
Up to now, I have not seen many people caring about this (most of my relatives do not even have a clue about where to dispose of used masks). However, some people are trying to put this waste to use. The folks at Bristol University caught on to the fact that masks are mainly made of polypropylene, which can be processed so that 3D printers can use it as filament for printing stuff.
I am not at all sure this procedure is safe: after all, used masks can be tagged as a biohazard, and going through the printer’s hot nozzle might not be enough to «kill» the virus. In any case, I think it is a cool project, if only for pointing out a big ecological problem that is out there.
TURNING OLD MASKS INTO 3D PRINTER FILAMENT, on hackaday
Apple and human rights
It seems impossible to get through a week without news like this. The Universal Declaration of Human Rights states that everyone has the right to choose and practice any religion. That seems to be a dead letter for Muslim people in China who try to read the Quran on an Apple device.
What really bugs me is not that Apple does not care at all about this (and do not get me wrong, Apple is not the only corporation that does not give a shit about people’s rights); it is the fact that over the last few years the company has presented itself as a standard-bearer of privacy, ecology, and human rights. You cannot pretend to be taken seriously if you bend the knee in China because it is the market that drives your sales. Capitalist hypocrisy at its best.
Apple removed a popular Quran app in China, on the verge
Kratos as you have never seen him
Oh boy, the rumours were true. It had been hinted many times that some Sony exclusive games were going to be released on PC (which means higher resolutions, frame rates, etc.). God of War was announced this week, and I hope it does really well so we get more titles that I would love to play (The Last of Us, Ghost of Tsushima).
Sony is officially bringing God of War to PC, on the verge
Should a dog’s sniff be enough to convict a person of murder?
Amazing story on Science about the use of dogs to find dead people. This is not news at all, but the debate it brings to the table is quite interesting. Is it enough that a dog determines there was once a dead body at your place to declare you guilty of a crime?
The science behind the problem is fascinating. First, we do not know how a dog’s brain works, and we certainly do not understand how they can track a dead body even months later. Second, the way the dogs are trained is up for debate, as it seems they are heavily influenced by their trainers (even if the trainers do not realize it). Dogs can read your posture, your mood, your facial expressions, and even pick up on your involuntary movements. During training, all of these inputs can make the dog find what you want it to find, even if there is no real «signal» (a smell, in this case) around.
I could not stop thinking about how this problem relates to many of the applications of machine learning we see nowadays. Given sufficiently complex tasks, the algorithms people use to tackle them are so complex (with billions of parameters to tune) that they are essentially black boxes (like the brains of the dogs that search for dead bodies). In the same way dog training is influenced by human movements or reactions, AI training sets are influenced by the biases of the humans who build them. We have seen many problems in things like face detection, where the algorithms do not detect black people or women with the same accuracy as white males (who are mainly the ones working on those tasks). There is also the problem of overfitting your data, which would be the analogue of the dog finding what you want even when it is not there.
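The overfitting analogy is easy to demonstrate. Here is a minimal sketch (with made-up data, pure random noise) where a flexible model «finds» a perfect pattern in data that has none, and then that pattern is useless on fresh samples from the same signal-free process:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pure noise: by construction there is no signal to learn.
x = np.linspace(0, 1, 8)
y = rng.normal(size=8)

# A flexible model (degree-7 polynomial, 8 free parameters)
# fits the noise essentially perfectly anyway.
coeffs = np.polyfit(x, y, deg=7)
train_error = np.max(np.abs(np.polyval(coeffs, x) - y))

# Fresh noise from the same signal-free process: the "pattern" does not help.
x_new = (x[:-1] + x[1:]) / 2
y_new = rng.normal(size=7)
test_error = np.max(np.abs(np.polyval(coeffs, x_new) - y_new))

print(f"train error: {train_error:.1e}")  # essentially zero
print(f"test error:  {test_error:.1e}")   # much larger
```

The model is the dog, the training points are the trainer’s cues, and the test points are the crime scene: a perfect score during training tells you very little about whether there is any real smell out there.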
Coming back to the article, it tells the story of a man accused of murdering his son, who was convicted mainly on the basis that a dog marked some spots near his cabin as places where the son’s remains had been. Should a black box determine whether you are guilty or innocent? Should we let algorithms that we do not really understand make health or safety decisions?
THE SNIFF TEST, on science
And that’s it for the week. Stay safe!
Featured image: The Detective Dog, by Julia Donaldson and Sara Ogilvie