AWS DeepLens plus Soracom IoT SIM
AWS DeepLens was first announced at AWS re:Invent 2017, and it’s been a long 6 months of waiting, but shipping day is here at last. Here at Soracom that means at least half the team is obsessively checking tracking updates and canceling weekend plans.
For those new to DeepLens, it’s a lot more than just a camera. It starts with the hardware, but AWS has packed in a complete Deep Learning toolkit, giving developers a fully programmable device plus the tutorials, sample code, and pre-trained models to hit the ground running. With object detection, face detection, activity detection and true Hot Dog/Not Hot Dog capability, DeepLens makes getting hands-on with Deep Learning easy and honestly really fun.
Since a lot of the fun happens out in the world where WiFi may not be accessible, we’re eager to see what we can do when we take DeepLens mobile by enabling cellular connectivity.
Fortunately, Soracom CTO and co-founder Kenta Yasukawa was able to get his hands on DeepLens early to show us how. (And just like AWS, it’s pay-as-you-go, so this example costs just a few cents a day.)
This is the AWS DeepLens device. It has a built-in camera and GPU, WiFi and USB ports, and runs Ubuntu Linux. It ships with the AWS Greengrass service and a configuration tool for self-registration with the AWS DeepLens service. (You can learn more about all that here: https://aws.amazon.com/deeplens/)
There are a bunch of sample, pre-trained models provided, along with sample projects using the models. Once you configure the device to connect to WiFi and register it to your AWS account, you can just pick a project, deploy to the device, and get rolling.
To get started, I put the DeepLens camera just to my left and ran the Object Detection project.
Deeplens object detection: monitor and programmer
The blue rectangles indicate that the model downloaded onto the DeepLens device (executed as a Lambda function via the Greengrass service) successfully detected me and the TV monitor as objects.
When an object is detected, the code executed on the DeepLens device publishes the information to an AWS IoT topic via MQTT.
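On the device, that reporting step looks roughly like the sketch below. The payload structure and topic name here are illustrative, not copied from the sample project, and the Greengrass SDK is only available inside the Greengrass runtime, so the publish call is shown as a comment:

```python
import json

def build_payload(detections):
    """Format {label: probability} pairs as a JSON message for the
    AWS IoT topic (the exact structure is illustrative)."""
    return json.dumps({label: round(prob, 2) for label, prob in detections.items()})

# Inside the Greengrass Lambda on the device, publishing would look
# something like this (greengrasssdk exists only in that runtime):
#
#   import greengrasssdk
#   client = greengrasssdk.client('iot-data')
#   client.publish(topic='$aws/things/deeplens_xxxx/infer',  # hypothetical topic
#                  payload=build_payload({'person': 0.92, 'tvmonitor': 0.44}))
```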
MQTT client subscribed to AWS IoT Topic
In my case, Object Detection reported a person and a tvmonitor with 92% and 44% probability, respectively. That means my body could fool the model into recognizing me as a human being. Score!
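Consuming those results is straightforward. As a minimal sketch, you might keep only detections above a confidence threshold before acting on them (the helper name and the 0.5 cutoff are my own choices, not part of the sample project):

```python
def confident_detections(detections, threshold=0.5):
    """Return (label, probability) pairs at or above the threshold,
    highest probability first."""
    return sorted(
        ((label, p) for label, p in detections.items() if p >= threshold),
        key=lambda pair: pair[1],
        reverse=True,
    )

# The frame described above: person at 92%, tvmonitor at 44%.
result = confident_detections({'person': 0.92, 'tvmonitor': 0.44})
print(result)  # → [('person', 0.92)] — only 'person' clears the cutoff
```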
Kidding aside, this is pretty great. It lets any developer run machine learning algorithms for their applications while AWS handles the undifferentiated heavy lifting so they can focus on their applications and business logic.
As Werner Vogels emphasized in his Day 2 keynote at re:Invent 2017, this has always been the core philosophy of AWS. Working with DeepLens, I feel that AWS is opening the door to let developers dive into a new world, just as they did by making infrastructure available for software engineers to control via API.
Since we launched the SORACOM platform, the team here has maintained a similar philosophy: we solve the issues common to IoT so our users can focus on their applications and business logic instead of spending cycles on the undifferentiated heavy lifting of connectivity, security and device management.
So what can we add to this particular case? One obvious use case is to add SORACOM Air to give AWS DeepLens a new cellular connectivity option. That’s exactly what I did right after I got the device and went back to my hotel room:
Taking DeepLens mobile with SORACOM Air
Back home in California, I did it again at my office. This time, I'll walk you through the steps.
1. First, I grabbed a USB cellular modem and a SORACOM SIM card (available on Amazon.com).
USB Modem ready for Soracom SIM
2. I popped out the SIM card and inserted it into the USB modem.
USB Modem with Soracom SIM loaded
3. I connected the USB modem to AWS DeepLens.
AWS DeepLens with USB 3G modem
At this point, the modem was not yet connected, but no worries: only a few steps remain to get the device online.
1. First, we ask the OS to detect the USB device as a modem. With the Huawei MS2131 that I used here, an appropriate driver is already included in Ubuntu Linux running on DeepLens. We just need to run the following command to let the OS recognize the dongle as a modem rather than a mass storage device.
$ sudo usb_modeswitch -v 12d1 -p 14fe -J
(12d1 is Huawei's vendor ID, 14fe is the product ID for the MS2131, and -J runs a Huawei-specific switching procedure. You may need to find the right values for your modem.)
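If you want the mode switch to happen automatically whenever the dongle is plugged in (handy for an unattended device), a udev rule along these lines should work. The file name is arbitrary, and the rule is a sketch for the MS2131 IDs used here; verify it against your distribution's udev setup:

```
# /etc/udev/rules.d/99-huawei-ms2131.rules  (file name is my own choice)
# When the MS2131 enumerates as mass storage (12d1:14fe), switch it to modem mode.
SUBSYSTEM=="usb", ACTION=="add", ATTR{idVendor}=="12d1", ATTR{idProduct}=="14fe", \
    RUN+="/usr/sbin/usb_modeswitch -v 12d1 -p 14fe -J"
```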
2. Next, we configure the network manager to dial up with the right APN when a modem is detected.
$ sudo nmcli con add type gsm ifname "*" con-name soracom apn soracom.io user sora password sora
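For reference, that one-liner persists a NetworkManager connection profile roughly equivalent to the keyfile below. The exact path and fields vary by NetworkManager version, so treat this as an approximation rather than the literal file:

```ini
# /etc/NetworkManager/system-connections/soracom  (approximate contents)
[connection]
id=soracom
type=gsm

[gsm]
apn=soracom.io
username=sora
password=sora
```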
Is that it? Yes, that is it. Here is the proof:
AWS DeepLens with active 3G connection!
See the difference? The steady blue LED on the USB dongle indicates that the modem has established an over-the-air IP connection.
Here is an even more obvious proof:
The device picked up a new interface corresponding to the cellular link. It now has two paths to the AWS IoT endpoint: one over WiFi and one over the cellular link with SORACOM Air.
This means that my AWS DeepLens device now has mobility. I can deploy it anywhere, as long as I have power and cellular coverage. WiFi is great as long as devices are static and inside a hotspot, but as most IoT developers know, that is not always something we can count on.
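With two links up, you can ask the kernel which one it would actually use for a given destination by checking the source address it selects. A connect() on a UDP socket only picks a route; no packets are sent. The helper name below is my own:

```python
import socket

def active_source_ip(dest_host, dest_port=53):
    """Return the local source IP the kernel would use to reach dest_host.
    connect() on a UDP socket only selects a route; nothing is transmitted."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect((dest_host, dest_port))
        return s.getsockname()[0]
    finally:
        s.close()

# On DeepLens, this would report the WiFi address while mlan0 is up,
# and the cellular (ppp/wwan) address once WiFi goes down.
```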
To emulate a situation where WiFi coverage is not available, I typed
$ sudo ifconfig mlan0 down
on my console to kill my local WiFi connection.
WiFi is now disabled
Poof. The WiFi was gone, and the device could now reach the internet only over cellular. Not a problem: the AWS DeepLens device kept reporting its inference results!
Since DeepLens is an edge computing device and talks to the cloud only to report inference results, the bandwidth offered by a cellular modem is good enough. The combination of AWS DeepLens and SORACOM Air gives developers a feasible solution for implementing intelligent, cutting-edge IoT applications.
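To get a feel for how little data this takes, here is a back-of-the-envelope estimate. The message size and reporting interval below are assumptions for illustration, not measured values, and protocol overhead is ignored:

```python
def monthly_megabytes(message_bytes, interval_seconds, days=30):
    """Estimate monthly data usage for periodic inference reports
    (ignores MQTT/TLS overhead, which adds some margin)."""
    messages_per_day = 86400 / interval_seconds
    return message_bytes * messages_per_day * days / 1e6

# Say each detection report is ~1 KB and one is sent every 5 seconds:
print(round(monthly_megabytes(1000, 5), 1))  # → 518.4 (MB/month)
```

Even at that fairly chatty rate, the monthly volume stays in the hundreds of megabytes, which is comfortable territory for a pay-as-you-go cellular plan.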
How else can SORACOM help IoT developers?
What I showed here is just one example of the IoT connectivity features available using SORACOM. In addition to the SORACOM Air connectivity service, we also offer network and application layer integration services.
For instance, you can use our network layer services to build a private, dedicated network for your cellular connected devices and your servers. This lets you isolate your devices from the public internet and still access them (e.g. via SSH) at your convenience.
You can also use SORACOM Canal to establish a private VPC peering link from our infrastructure to your AWS VPC, or use SORACOM Gate to create a virtual L2 subnet for your devices and servers and let them communicate with each other as if they were on the same LAN.
You might also leverage our application integration services to simplify device management. In reality, you'd need to provision at least one set of credentials for each IoT device so it can bootstrap and/or authenticate to a backend service such as AWS IoT or the AWS DeepLens service.
For example, SORACOM Endorse issues a digital certificate that assures a client has a particular combination of IMSI (identity inside SIM) and IMEI (identity unique to each modem) so you can authenticate a device. SORACOM Beam proxies requests from a device to your server, adding the device’s IMSI and IMEI to the headers.
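On the server side, consuming those proxied requests is simple: the backend reads the identity headers instead of managing per-device credentials. The sketch below assumes header names following Soracom's X-Soracom-IMSI / X-Soracom-IMEI convention; check the current Beam documentation for the exact names before relying on them:

```python
def device_identity(headers):
    """Extract the SIM/modem identity that a Beam-style proxy adds to a
    forwarded HTTP request. Header names are assumed, per the lead-in."""
    normalized = {k.lower(): v for k, v in headers.items()}
    return {
        'imsi': normalized.get('x-soracom-imsi'),
        'imei': normalized.get('x-soracom-imei'),
    }

# A backend handler can then authorize the device by its SIM identity:
ident = device_identity({'X-Soracom-IMSI': '440103123456789'})
```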
Last but definitely not least
If you’d like to learn more about Soracom and IoT in general or otherwise come test out code with us, please feel free to join our online community at https://community.soracom.io/
You can follow us on social networks and MeetUp groups to get updated on our latest news and events:
Twitter: @SoracomIoT and @thekentiest
Bay Area MeetUp: Soracom SV-IoT
Paris MeetUp: Soracom IoT-WS
London MeetUp: Soracom IoT-WS London