China protests, Robot Police, and more: THE WEEKLY RECAP (2022#48)

December is here, and with it the year is coming to an end (that was fast!). This week: a few links on how China is handling Covid and the protests against it; the US going pedal to the metal into cyberpunk; and a cool demonstration of Neural Radiance Fields. Let’s start:


For the past few weeks, many shocking images have been coming out of China. I started seeing Chinese people protesting at the beginning of the World Cup, saying that they did not understand how people all over the globe could enjoy the tournament without any health restrictions while they were still suffering very strict quarantines and nonstop testing, which made their everyday lives very difficult. Since then, the number of posts on social media (especially Twitter) has grown exponentially.

This feels super weird, as you do not usually see people protesting in China. From my experience with Chinese people, complaining about the government is something you do not do in public, and even in private you have to be very careful about whom you speak to (Big Brother is always listening). I always assumed that, even if a few people complained, the news would never reach Europe, which makes all of this even more impressive. Seeing this many protests aired makes me think the number of people protesting right now has to be massive. Earlier this week, accounts apparently tied to Chinese authorities flooded Twitter with porn and spam to bury tweets about the protests, since they were unable to censor everything being posted. If you are interested in what is happening right now, @songpinganq on Twitter is publishing a lot of examples (some of them quite difficult to watch) of people being taken to quarantine camps.

Can China avoid a wave of deaths if it lifts strict zero COVID policy?, on Nature
China attempts to curb, censor rare nationwide protests over Covid lockdowns, on France24
Twitter hit with wave of porn and spam obscuring tweets about China protests, on The Verge
How China’s Police Used Phones and Faces to Track Protesters, on The New York Times
China protests: The young people powering the demonstrations, on BBC

Come Quietly, Or There Will Be…Trouble!

This story generated very negative reactions, but I honestly think it could be great news (if implemented right, of course). Apparently, San Francisco has approved the use of lethal robots by its Police. Of course, put like that, the first thing that comes to mind is robots killing people on the streets. However, their use could have good outcomes. First, exposing machines to dangerous situations instead of people could mean that fewer officers get injured. Also, if officers themselves are not in danger, there should be less need to use lethal force to subdue suspects. Last, when operated correctly, I am pretty sure robots are less prone to killing people who belong to minorities than US Police officers are.
And worst case scenario, we will start watching some of our favorite science fiction films (Robocop, Terminator, Short Circuit) play out on the streets of San Francisco.

Robots can be armed with explosives to ‘contact, incapacitate, or disorient violent, armed, or dangerous suspect’ in ‘extreme circumstances,’ says the SFPD

James Vincent – The Verge
San Francisco Considers Allowing Use of Deadly Robots by Police, on The New York Times
San Francisco approves use of remote-controlled robots to kill suspects, on The Verge

Neural Radiance Fields

A couple of years ago, scrolling through some papers from a computer vision conference, I found a cool video on a topic I knew nothing about: Neural Radiance Fields (NeRF). The basic idea is to learn, from photos or video of a scene, a model of how light is emitted and reflected at every point in that scene. Afterwards, you can use that model to render the scene from new viewpoints, or to calculate how light would behave if you introduced changes (for example, placing a new object or a person in it). This is amazing because it is a very efficient way of generating extremely detailed and high-quality digital images, which you can use to produce photorealistic visual effects (people at Disney are clearly not aware of this tech).

This week I stumbled across an article on Hackaday talking about the topic, with a video from the Corridor Crew showing some of the possibilities I just mentioned. Worth a watch!

A new technology, NERF, for “NEural Radiance Fields”, has decreased the headaches a lot.  Instead of making a 3D model of the scene and using that to predict what reaches the camera, the software starts with video of the scene and machine learns a “radiance field” – a model of how light is reflected by the scene.
If you use the radiance field to predict the light that falls on a 3D model, the software can render the 3D model as if it was lit inside the scene. So if your actor stands near a red wall, the red reflection will show on their face.
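To make the "predict the light that reaches the camera" part more concrete, here is a toy numpy sketch of the volume-rendering step NeRF uses: it marches along a single camera ray, turning the densities and colors sampled at each point into one final pixel color via alpha compositing. This is only an illustration of the rendering math, not the neural network itself; all the sample values below are made up.

```python
import numpy as np

def render_ray(sigmas, colors, deltas):
    """Composite a pixel color along one camera ray, NeRF-style.

    sigmas: (N,) volume densities at N samples along the ray
    colors: (N, 3) RGB color predicted at each sample
    deltas: (N,) distance between consecutive samples
    """
    # opacity contributed by each segment of the ray
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # transmittance: fraction of light that survives up to sample i
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    # weight of each sample in the final color
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)

# toy scene: empty space, then a dense red blob halfway along the ray
sigmas = np.array([0.0, 0.0, 50.0, 50.0, 0.0])
colors = np.tile(np.array([1.0, 0.0, 0.0]), (5, 1))
deltas = np.full(5, 0.1)
pixel = render_ray(sigmas, colors, deltas)  # close to pure red
```

In the real thing, `sigmas` and `colors` come from a small neural network queried at each 3D sample position (and viewing direction), and the whole pipeline is trained so that rendered pixels match the input photos.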

NERF – Neural Radiance Fields, on Hackaday
NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis, on arXiv

And that’s it for the week. Stay safe!
Now listening: Elènne – Butterhorn
