I have trained a model to classify some symbols using Create ML.
In my app I am using VNImageRequestHandler and VNCoreMLRequest to classify image data.
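For context, the classification path looks roughly like this (a sketch; SymbolClassifier is a placeholder for the class Xcode generates from my .mlmodel, and the orientation value is a guess for a portrait camera):

import Vision
import CoreML

// Wrap the Create ML model for use with Vision.
let coreMLModel = try VNCoreMLModel(for: SymbolClassifier(configuration: MLModelConfiguration()).model)

let classificationRequest = VNCoreMLRequest(model: coreMLModel) { request, _ in
    if let top = (request.results as? [VNClassificationObservation])?.first {
        print("\(top.identifier): \(top.confidence)")
    }
}

// Called from captureOutput(_:didOutput:from:) with each camera frame.
func classify(pixelBuffer: CVPixelBuffer) {
    // .right is an assumption for a portrait capture session; adjust as needed.
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .right, options: [:])
    try? handler.perform([classificationRequest])
}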
If I use a CVPixelBuffer obtained from an AVCaptureSession, the classifier runs as I would expect. If I point the camera at the symbols it classifies them fairly accurately, so I know the model is trained reasonably well and works in my app.
If I try to use a cgImage that is obtained by cropping a section out of a larger image (from the gallery), then the classifier does not work. It always seems to return the same result (and although the confidence is not exactly 1.0 and varies for each image, it is always within a few decimal places of it, e.g. 0.9999).
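The failing path is essentially just this, using the same classificationRequest as above (croppedCGImage is the crop taken from the gallery image):

// croppedCGImage is the CGImage cropped out of the larger gallery image.
let handler = VNImageRequestHandler(cgImage: croppedCGImage, orientation: .up, options: [:])
try? handler.perform([classificationRequest])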
If I pause the app when I have the cropped image, use the debugger to export it (via the little Quick Look eye icon, then open in Preview), and then drop that image into the Preview section of the MLModel file or into Create ML, the model classifies the image correctly.
If I scale the cropped image to the same size as the frames I get from my camera, and convert the cgImage to a CVPixelBuffer with the same size and pixel format as the camera (1504 x 1128, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), then I get some difference in output. It's still not accurate, but it returns different results if I specify the 'centerCrop' or 'scaleFit' options, so I know that 'something' is happening; it's just not the correct thing.
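This is roughly the conversion I'm experimenting with (a sketch; note it renders into a 32BGRA buffer via CGContext, since CGContext can't draw directly into the camera's biplanar YCbCr format, and makePixelBuffer is just my own helper name):

import CoreVideo
import CoreGraphics

func makePixelBuffer(from image: CGImage, width: Int, height: Int) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let attributes = [kCVPixelBufferCGImageCompatibilityKey: true,
                      kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA, attributes, &buffer) == kCVReturnSuccess,
          let pixelBuffer = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    // Draw the CGImage into the pixel buffer's backing memory, scaling to the target size.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue |
                                              CGBitmapInfo.byteOrder32Little.rawValue) else { return nil }
    context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
    return pixelBuffer
}

// e.g. scale the crop up to the camera's frame size before classification.
let scaledBuffer = makePixelBuffer(from: croppedCGImage, width: 1504, height: 1128)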
I was under the impression that passing a cgImage to the VNImageRequestHandler would perform the necessary conversions, but experimentation shows this is not the case. However, when using the preview tool on the model or in Create ML this conversion is obviously being done behind the scenes because the cropped part is being detected.
What am I doing wrong?
tl;dr
- My model works, as backed up by feeding it video input directly and by dropping cropped images into the preview sections.
- Passing the cropped images directly to the VNImageRequestHandler does not work.
- Modifying the cropped images can produce different results, but I cannot see what I should be doing to get reliable results.
I'd like my app to behave the same way the preview does: I give it a cropped part of an image, it does some processing, the image goes to the classifier, and it returns the same result as Create ML.
Hi all
The situation is as follows.
I have a device that I connect to via WiFi; amongst other things, this device broadcasts a stream of UDP data that I want to read in my App.
Other Apps on the device (written by other developers) will also want to read these packets.
I want to be able to receive the broadcast UDP messages without blocking other Apps from doing so.
The problem I have encountered is that I seem to be fetching the data in a way that does not play nice with others.
If I start the other apps first, they can receive data, and when I then connect in my App I am also able to receive the UDP messages, whether using NWListener-generated connections, using socket/bind via getaddrinfo, or just directly creating a sockaddr_in.
If I start my App and connect first, other Apps are then unable to retrieve any data; I presume this is because I am binding to the port somehow and hogging it.
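For reference, the NWListener path I'm using looks roughly like this (a sketch; 4000 is the port seen in the tcpdump capture below, and error handling is omitted):

import Network

// Keep reading datagrams from a connection handed to us by the listener.
func receive(on connection: NWConnection) {
    connection.receiveMessage { data, _, _, error in
        if let data = data {
            print("received \(data.count) bytes")
        }
        if error == nil {
            receive(on: connection)
        }
    }
}

let listener = try NWListener(using: .udp, on: 4000)
listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
    receive(on: connection)
}
listener.start(queue: .main)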
I have tried setting SO_REUSEADDR and SO_REUSEPORT on the socket, with no success.
I have not been able to receive any UDP data using recvfrom etc. unless I call bind first; however, from my research I was led to believe I shouldn't need to bind when just listening for UDP broadcast data.
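The BSD-sockets variant is roughly the following (a sketch in Swift, binding to INADDR_ANY on port 4000 with the reuse options set before bind; return values and error handling omitted):

import Darwin

let fd = socket(AF_INET, SOCK_DGRAM, 0)

// Allow other sockets to bind the same address/port.
var on: Int32 = 1
setsockopt(fd, SOL_SOCKET, SO_REUSEADDR, &on, socklen_t(MemoryLayout<Int32>.size))
setsockopt(fd, SOL_SOCKET, SO_REUSEPORT, &on, socklen_t(MemoryLayout<Int32>.size))

// Bind to INADDR_ANY on port 4000.
var addr = sockaddr_in()
addr.sin_len = UInt8(MemoryLayout<sockaddr_in>.size)
addr.sin_family = sa_family_t(AF_INET)
addr.sin_port = in_port_t(4000).bigEndian
addr.sin_addr.s_addr = in_addr_t(0)   // INADDR_ANY

withUnsafePointer(to: &addr) {
    $0.withMemoryRebound(to: sockaddr.self, capacity: 1) {
        _ = bind(fd, $0, socklen_t(MemoryLayout<sockaddr_in>.size))
    }
}

// Blocking read of a single datagram.
var buffer = [UInt8](repeating: 0, count: 2048)
var sender = sockaddr_in()
var senderLen = socklen_t(MemoryLayout<sockaddr_in>.size)
let count = withUnsafeMutablePointer(to: &sender) {
    $0.withMemoryRebound(to: sockaddr.self, capacity: 1) {
        recvfrom(fd, &buffer, buffer.count, 0, $0, &senderLen)
    }
}
print("received \(count) bytes")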
When using getifaddrs to view the various interfaces available to the App, I can see the connection to the device as follows.
Name: en0
Is Broadcast (checking ifa_flags & IFF_BROADCAST)
addr: 192.168.4.2
netmask: 255.255.255.0
gateway: 192.168.4.255
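The interface walk itself is roughly this (a sketch):

import Darwin

// Walk the interface list and print IPv4 interfaces that have IFF_BROADCAST set.
func listBroadcastInterfaces() {
    var first: UnsafeMutablePointer<ifaddrs>?
    guard getifaddrs(&first) == 0 else { return }
    defer { freeifaddrs(first) }

    var cursor = first
    while let ifa = cursor?.pointee {
        if (ifa.ifa_flags & UInt32(IFF_BROADCAST)) != 0,
           let addr = ifa.ifa_addr, addr.pointee.sa_family == sa_family_t(AF_INET) {
            var host = [CChar](repeating: 0, count: Int(NI_MAXHOST))
            getnameinfo(addr, socklen_t(addr.pointee.sa_len),
                        &host, socklen_t(host.count), nil, 0, NI_NUMERICHOST)
            print("\(String(cString: ifa.ifa_name)): \(String(cString: host))")
        }
        cursor = ifa.ifa_next
    }
}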
If I connect the WiFi device to my Mac, I can use tcpdump to see the UDP packets being broadcast as follows.
tcpdump -c 4 -l -n -i en0 'udp and port 4000'
Password:
tcpdump: verbose output suppressed, use -v[v]... for full protocol decode
listening on en0, link-type EN10MB (Ethernet), snapshot length 524288 bytes
11:47:27.297327 IP 192.168.4.1.49153 > 192.168.4.3.4000: UDP, length 33
11:47:27.429211 IP 192.168.4.1.49153 > 192.168.4.3.4000: UDP, length 75
11:47:27.911906 IP 192.168.4.1.49153 > 192.168.4.3.4000: UDP, length 52
11:47:28.029487 IP 192.168.4.1.49153 > 192.168.4.3.4000: UDP, length 33
4 packets captured
8 packets received by filter
0 packets dropped by kernel
I feel like I have all the pieces, but I don't understand the correct process for setting up a socket to receive broadcast UDP packets from my device in a way that doesn't affect other Apps running on the device. I am happy to try solutions using either NWListener/NWConnection or BSD sockets.