ARKit

Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game using ARKit.

ARKit Documentation

Posts under ARKit tag

346 Posts
Post not yet marked as solved · 0 Replies · 244 Views
Hey everyone! 👋 I've always been interested in AR/VR/mixed reality and was inspired by the release of the Apple Vision Pro to play around with RealityKit on iOS. I was really impressed with how quick and simple iOS 17's Object Capture API made it to capture lifelike digital copies of objects.

Objy is an iOS app with two modes: Capture and Discover. Capture mode (requires LiDAR, so Pro-model iPhones from the iPhone 12 Pro onward) guides the user through walking around an object and creating the capture. The app processes these captures into a digital copy on-device, which you can upload to a profile. Discover mode presents a semi-transparent box you can position in your environment using your camera feed. Once set, this box anchors your object discovery feed as you swipe up and down to discover new objects. You can show your love for your discoveries with a burst of hearts, or save objects to your profile to view anywhere you want at a later time. Your profile holds your captured objects as well as your saved objects, and from your profile you can view any of these objects in your own space. You can share objects from your profile; sharing to other iOS devices will open them in the native iOS AR Quick Look.

Check out a demo video I made: https://youtube.com/shorts/_pSgPMFA1F4?si=S1M4sjW_ooG3LLrr

The most recent TestFlight build adds guest mode, so you can see a shorter discovery feed without creating a profile, and it also adds Sign In with Apple. I'd love to get any feedback on the app (design, functionality, bugs, or more). I'm also always up for chatting with people passionate about AR and spatial computing, so feel free to reach out either via the support email address in the app or on the Discord I made for feedback: https://discord.gg/4FwDSApv

Thanks, Marcus
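For readers curious how a flow like this is typically put together, here is a minimal, hedged sketch of the iOS 17 Object Capture pipeline (ObjectCaptureSession plus ObjectCaptureView for the guided scan, PhotogrammetrySession for on-device reconstruction). This is not Objy's actual code; the view name, folder path, and reconstructModel helper are illustrative assumptions.

```swift
import SwiftUI
import RealityKit

// A guided capture screen: ObjectCaptureSession drives the scan and
// ObjectCaptureView renders the system capture UI over the camera feed.
struct CaptureSketchView: View {
    @State private var session: ObjectCaptureSession?

    var body: some View {
        Group {
            if let session {
                ObjectCaptureView(session: session)
            } else {
                ProgressView("Preparing capture…")
            }
        }
        .onAppear {
            // Hypothetical folder to collect the captured images.
            let imagesFolder = FileManager.default.temporaryDirectory
                .appendingPathComponent("Captures", isDirectory: true)
            try? FileManager.default.createDirectory(at: imagesFolder,
                                                     withIntermediateDirectories: true)
            let newSession = ObjectCaptureSession()
            newSession.start(imagesDirectory: imagesFolder)
            session = newSession
        }
    }
}

// On-device reconstruction of the captured images into a USDZ model.
func reconstructModel(from imagesFolder: URL, into modelURL: URL) async throws {
    let photogrammetry = try PhotogrammetrySession(input: imagesFolder)
    try photogrammetry.process(requests: [.modelFile(url: modelURL)])
    for try await output in photogrammetry.outputs {
        if case .processingComplete = output {
            print("Model written to \(modelURL.path)")
        }
    }
}
```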
Posted. Last updated.
Post not yet marked as solved · 0 Replies · 258 Views
Put on a record, watch the visuals. I don't really know how far I'll go with this yet. For now it is just a volumetric window with a couple of audio-reactive particle systems. Got lots of ideas, though! Maybe make an immersive space so I can use ARKit somehow... hmmm.
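For anyone tinkering with a similar setup, here is a minimal sketch of a visionOS volumetric window hosting a RealityKit particle emitter. The app name, window id, size, and the particular emitter preset are assumptions, and the audio-reactive part (driving emitter parameters from audio analysis) is left out.

```swift
import SwiftUI
import RealityKit

@main
struct VisualsApp: App {
    var body: some Scene {
        // A volumetric window gives the content real depth in the shared space.
        WindowGroup(id: "visuals") {
            RealityView { content in
                // Entity carrying a built-in particle emitter preset (assumed here).
                let emitter = Entity()
                emitter.components.set(ParticleEmitterComponent.Presets.sparks)
                content.add(emitter)
            }
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}
```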
Posted by pj4533. Last updated.
Post not yet marked as solved · 1 Reply · 337 Views
Hi iOS fam, we launched our Android version on World Cleanup Day (September 16, 2023), are at 7K+ installs, and are asking for YOUR help with testing the iOS version. We are aware of a general lag issue between the frontend and backend, so please know that data for the map and leaderboard takes about 10 seconds to fetch; we're working on it. We really need YOUR help to try to break the app with: (1) multiple reports; (2) rapid switching between the leaderboard and map; (3) map browsing; (4) etc., so we can pinpoint the problem areas and FIX them ASAP. Thank you so much! 💚 The app is currently in React Native, but we are investigating a move to Swift (MapKit, UIKit, ARKit, Core Location) as soon as possible.
Posted by CleanApp. Last updated.
Post not yet marked as solved · 0 Replies · 257 Views
CamPlan provides an incredibly fast way to create modern floor plans using AR and AI. It takes full advantage of RoomPlan, RealityKit, and ARKit. Users can scan their home using just their device; after the scanning process, they can view and edit the floor plans in 2D, in 3D, and also in AR. The app provides carefully crafted images and 3D models for the pieces of furniture. CamPlan has been updated to make the most of the iOS 17 RoomPlan and RealityKit features:

- Support for big areas: Users can now scan larger areas (more than 700 sq ft) using RoomPlan's MultiRoom feature (see the sketch below). We have tested the feature internally on multiple large areas with success. Here is one example: https://camplan.app/ios17.mp4
- Enhanced 3D pieces of furniture: The app can give more detail about your scanned furniture; for instance, it can detect how many legs a chair has, whether it has arms, and so on. These details are visible in both the 2D and 3D representations. (RoomPlan's object attribute features)
- Room recognition: It can also detect the rooms on your floor plan and automatically adds text labels to them. (RoomPlan's Section feature)
- Floor recognition: During scanning, CamPlan assigns a floor number to the scanned area, so it automatically detects whether it is the ground floor, first floor, etc. (RoomPlan's Story feature)
- Accessibility: The scanning feature supports VoiceOver.
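As a point of reference for the MultiRoom support described above, here is a minimal sketch (not CamPlan's code) of how iOS 17's RoomPlan StructureBuilder can merge several scanned rooms into one structure; the function name and the capturedRooms input are illustrative assumptions.

```swift
import RoomPlan

// Merge several individual RoomPlan scans into a single multi-room structure.
// `capturedRooms` is assumed to hold the CapturedRoom results of each scan.
func mergeScans(_ capturedRooms: [CapturedRoom]) async throws -> CapturedStructure {
    let builder = StructureBuilder(configuration: .beautifyObjects)
    // The resulting structure keeps section (room) and story (floor) information.
    return try await builder.capturedStructure(from: capturedRooms)
}
```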
Posted. Last updated.
Post not yet marked as solved · 0 Replies · 240 Views
We would greatly appreciate someone testing our app and giving us feedback. It is an open social network that finds users who are physically similar. Why similarity? Because mutual sympathy can grow out of it: you listen to the same music as me, you have similar opinions, you look like me, so I like you. In the app, users can share photos, and we emphasize use of the entire mobile display. You can also lock some photos, which can then be unlocked for a fee. The application is localized only in the following languages: English, Spanish, German, Czech, and Slovak. The minimum supported version is iOS 13.0.
Posted by winTwins. Last updated.