Every month I pick out five projects from Developer Mesh that I find interesting and share them with you. There is a diverse array of projects on the site, so narrowing it down to just five can be difficult! I hope you’ll take a few minutes to find out why each of these projects caught my eye and then hop over to Mesh to see what other projects interest you.
“Apocalypse” and “fun” don’t really seem like two things that should go together. But in actuality, it’s a whole lot easier to deal with made-up apocalyptic scenarios than it is to deal with real-life disasters. I suppose that’s why post-apocalyptic and dystopian themes are so popular; you can picture yourself as the hero. I personally love the genre, so when I saw Pedro Kayatt’s Apocalypse Rider VR game I had to take a closer look. In this game you ride a motorcycle at high speed, avoiding hostile traffic, as you make your way through 20 levels of scorched wasteland. Best of all, Pedro says you won’t get VR sickness when playing this game.
In my opinion, the more proactive and preventative you can be about your own health, the better off you’ll be. With this project, Prajjwal Bhargava hopes to build models that capture anomalies relevant to disease progression and treatment monitoring. By using a deep convolutional generative adversarial network (DCGAN) to learn the range of normal anatomical variability, he believes high accuracy in anomaly detection can be achieved, and that medical imaging will enable the observation of markers correlating with disease status and treatment response.
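The core idea behind GAN-based anomaly detection is that a generator trained only on normal data can reproduce normal samples but not anomalous ones, so the reconstruction error becomes an anomaly score. Here is a minimal, AnoGAN-style sketch of that scoring step; the linear "generator" below stands in for a trained DCGAN, and all dimensions and parameter values are illustrative, not taken from Prajjwal's project.

```python
import numpy as np

# Toy "generator": maps a 2-D latent vector to an 8-D "image".
# In the real project this would be a trained DCGAN generator;
# the linear map here is purely illustrative.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 2))

def generate(z):
    return W @ z

def anomaly_score(x, steps=500, lr=0.05):
    """Search latent space for the closest 'normal' sample,
    then score x by its residual reconstruction error."""
    z = np.zeros(2)
    for _ in range(steps):
        residual = generate(z) - x
        z -= lr * (W.T @ residual)   # gradient of 0.5 * ||G(z) - x||^2
    return np.linalg.norm(generate(z) - x)

# A sample the generator can reproduce scores near zero;
# one it cannot reproduce scores high.
normal = generate(np.array([0.7, -1.2]))
anomalous = normal + rng.normal(scale=2.0, size=8)
print(anomaly_score(normal), anomaly_score(anomalous))
```

With a real DCGAN the same search runs by backpropagating through the generator network rather than through a fixed matrix, but the scoring logic is identical.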
Some of the best videos online are of people falling over or running into things. They are hilarious to watch, and this project by Siddharth Nayak looks like it will be just as entertaining. I know I would love to watch a one-wheeled balancing robot learn to balance itself using reinforcement learning. I hope Siddharth gets video of the process to share with us so we can watch this robot train itself to be the best unicycle robot ever.
I’ve seen a lot of projects lately that use camera systems to detect an object’s surroundings. This project caught my eye because instead of visual cues it uses ultrasound sensors to autonomously navigate its environment. Avirup Basu’s robot vehicle carries three ultrasound sensors that record data and send it to an application, which processes the readings to map the path traversed, essentially drawing out a 2D occupancy matrix of the robot’s route and its environment.
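The mapping step Avirup describes can be sketched briefly: each ultrasound echo gives a range along a known sensor direction, and the reflection point is dropped into a grid cell. This is a minimal sketch under assumed details; the three sensor angles, grid size, and cell size below are my illustrative choices, not specifics from Avirup's build.

```python
import math
import numpy as np

# Assumed layout: three ultrasound sensors facing left, forward, right.
GRID = 16             # 16 x 16 cells covering a 4 m x 4 m area
CELL = 0.25           # each cell spans 25 cm
SENSOR_ANGLES = (math.pi / 2, 0.0, -math.pi / 2)

def mark_obstacles(grid, x, y, heading, ranges):
    """Project each echo's reflection point into the occupancy matrix.
    ranges: distances in metres from the (left, front, right) sensors."""
    for offset, r in zip(SENSOR_ANGLES, ranges):
        angle = heading + offset
        ox = x + r * math.cos(angle)      # world coordinates of the echo
        oy = y + r * math.sin(angle)
        col, row = math.floor(ox / CELL), math.floor(oy / CELL)
        if 0 <= row < GRID and 0 <= col < GRID:
            grid[row, col] = 1            # 1 = occupied, 0 = unknown/free

grid = np.zeros((GRID, GRID), dtype=int)
# Robot at (1 m, 1 m) facing +x; obstacles echo at 0.5 m on all three sensors.
mark_obstacles(grid, 1.0, 1.0, 0.0, (0.5, 0.5, 0.5))
```

Repeating this as the robot moves, and using its odometry for (x, y, heading), gradually fills in a map of everything the sensors have pinged.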
There are so many options on the market that will send you a box of ready-to-make meals all portioned out and ready to cook. Some would say too many (and too much packaging). What I like about VeggieBox is that it is a home appliance that can examine the ingredients you have and then suggest the best recipes for a tasty, healthy, home-cooked meal. Vu Pham plans to use the Movidius Neural Compute Stick to bring deep learning to this low-powered home device. This seems like a no-brainer, as it helps you use up your food in a healthy way and cut down on food waste.
Become a Member
Interested in getting your project featured? Join us at Developer Mesh today and become a member of our amazing community of developers.