Detect a lamp that turns on and off with the camera

Hello


I'd like to make an app in Swift that detects Morse code with the camera. I mean, if you point the camera at a lamp that turns on and off, the app will detect the Morse code and translate it into English or another language. It's like a QR reader, but with flashes of light.

I've got some basics in Swift, but I have no idea how I should do it, and I've already searched the Internet without success.

I think I should show a target that the user has to point at the lamp, and the app would detect a color change on the target, but I don't know how to do that.


Thanks for your help.


Sigma


PS: Sorry if there are some mistakes; I'm not a native English speaker.

Just some ideas to start.


You can read this, which shows the idea has been tried:

https://stackoverflow.com/questions/4888251/would-it-be-possible-for-a-mobile-app-to-detect-a-flashing-light-with-its-camera


I understand you want to read Morse in real time, not record a video and process it later.

So, you need to read a sensor.

There is an ambient light sensor, but it's a private API, so its use is not allowed for apps on the App Store.


But there is another way: if the light is directed at the screen, it will change the screen brightness, and you can read that:

https://stackoverflow.com/questions/6309643/reading-the-iphones-ambient-light-sensor


I would thus try the following in the app:

- when you are in the Morse reading function

- set the brightness mode to automatic (or ask the user to do it manually)

- repeatedly read the value of the screen brightness

- measure the time the value stays over a threshold you decide (such as 0.5)

- depending on the duration, classify it as a dot or a dash
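The steps above could be sketched roughly like this. Note the assumptions: `UIScreen.main.brightness` is assumed to move with the ambient light when auto-brightness is on, and the 0.5 threshold and 0.4 s dash cutoff are arbitrary values to tune, not fixed Morse timings:

```swift
import UIKit

/// Rough sketch: poll the screen brightness (which auto-brightness adjusts
/// in response to ambient light) and time how long it stays above a threshold.
final class BrightnessMorseReader {
    private var timer: Timer?
    private var litSince: Date?          // when the "light on" state started
    private let threshold: CGFloat = 0.5 // arbitrary; tune for your environment
    private let dashDuration: TimeInterval = 0.4 // longer than this = dash

    var onSymbol: ((String) -> Void)?    // called with "." or "-"

    func start() {
        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            self?.sample()
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }

    private func sample() {
        let lit = UIScreen.main.brightness > threshold
        switch (lit, litSince) {
        case (true, nil):
            litSince = Date()                       // light just came on
        case (false, let start?):                   // light just went off
            let duration = Date().timeIntervalSince(start)
            onSymbol?(duration > dashDuration ? "-" : ".")
            litSince = nil
        default:
            break                                   // no state change
        }
    }
}
```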


Those are just the principles to start with; I wish you good luck.

The ambient light sensor is not likely to be a useful solution, because it's only going to detect a difference if the light is very close to the screen, or bright enough to (say) light up the entire room.


I would start here:


https://developer.apple.com/library/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html


The general idea would be to use an AVFoundation capture input to get incoming video frames, and a custom AVFoundation writer. This is similar to what you'd do if you wanted to record a movie in a non-standard format, except that you wouldn't actually write out video frames or a movie.
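A minimal sketch of such a capture pipeline, using a sample buffer delegate instead of a movie file output (permission checks and error handling omitted; the `FrameGrabber` name and `onFrame` callback are hypothetical):

```swift
import AVFoundation

/// Sketch of a capture pipeline that hands every incoming video frame to a
/// closure instead of writing out a movie.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "frame-grabber")
    var onFrame: ((CVPixelBuffer) -> Void)?   // called once per frame

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            onFrame?(pixelBuffer)             // analyze the frame here
        }
    }
}
```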


To detect the light intensity, I would start looking at CIFilter to see if there's something there that will basically analyze the incoming images for you (possibly cropped to a user-chosen region within the image). For example, you might look for intensity peaks in a histogram.
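For instance, Core Image's built-in CIAreaAverage filter reduces a region of the frame to its average color, which gives one intensity value per frame. A sketch under those assumptions (the region choice and the luminance weights are mine, not anything prescribed by Core Image):

```swift
import CoreImage

/// Returns the average brightness (0...1) of `region` within `pixelBuffer`,
/// using the built-in CIAreaAverage filter.
func averageBrightness(of pixelBuffer: CVPixelBuffer,
                       in region: CGRect,
                       context: CIContext = CIContext()) -> Double? {
    let image = CIImage(cvPixelBuffer: pixelBuffer)
    guard let filter = CIFilter(name: "CIAreaAverage") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(CIVector(cgRect: region), forKey: kCIInputExtentKey)
    guard let output = filter.outputImage else { return nil }

    // The filter's output is a 1x1 image whose single pixel is the average color.
    var pixel = [UInt8](repeating: 0, count: 4)
    context.render(output,
                   toBitmap: &pixel,
                   rowBytes: 4,
                   bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                   format: .RGBA8,
                   colorSpace: nil)

    // Rough luminance from the RGB channels.
    let (r, g, b) = (Double(pixel[0]), Double(pixel[1]), Double(pixel[2]))
    return (0.299 * r + 0.587 * g + 0.114 * b) / 255.0
}
```

Comparing this value against a threshold on each frame would give the on/off signal to time into dots and dashes.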


You might also be able to use ARKit to track the real-time position of the light source, in case the user moves or reorients the device while your app is watching. Whether this is feasible depends on the environment in which it will be used.

Sure, that's certainly much more robust and powerful than what I wrote.

With the AVFoundation writer, you still need to perform some image analysis to track the source and decide whether it is lit or not. Some work there, I assume.


My point was more to have a solution for some testing, certainly not an app on the App Store.

Hello Sigma_Philein
I was very interested in your application and was wondering if you ever managed to develop your idea?
I would be interested in purchasing such an app.
I've searched for a while to find a Morse code “light flashing” decoder!
But so far .. no luck
73’
Peter

Yes, this is possible using computer vision. Computer-vision algorithms can detect when a lamp is turned on or off from the camera's image or video output. This can be done by training an algorithm on images of the lamp in its different states (on/off) and then applying it to the camera's frames to detect the state of the lamp.

https://bit.ly/3EQiZ9A
