We have to convert a local DOC file to PDF without any server interaction; it has to work fully offline.
Any suggestions would be appreciated.
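One offline approach that is sometimes used (there is no official DOC-to-PDF API) is to let WKWebView render the local .doc and then export the rendered page with createPDF(configuration:completionHandler:) on iOS 14+. A rough sketch, assuming WebKit can render your particular .doc and that the converter object is kept alive until the callback fires:

import WebKit

// Rough sketch: render a local .doc in a WKWebView, then export the rendered
// content as PDF data. Entirely offline, but fidelity depends on WebKit's
// built-in document rendering.
final class DocToPDFConverter: NSObject, WKNavigationDelegate {
    private let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 612, height: 792),
                                    configuration: WKWebViewConfiguration())
    private var completion: ((Result<Data, Error>) -> Void)?

    func convert(docURL: URL, completion: @escaping (Result<Data, Error>) -> Void) {
        self.completion = completion
        webView.navigationDelegate = self
        // Load the local document; read access is limited to its folder.
        webView.loadFileURL(docURL, allowingReadAccessTo: docURL.deletingLastPathComponent())
    }

    func webView(_ webView: WKWebView, didFinish navigation: WKNavigation!) {
        // createPDF (iOS 14+) captures the rendered page as PDF data.
        webView.createPDF { [weak self] result in
            self?.completion?(result)
        }
    }
}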
WWDC23 Community
Engage with fellow WWDC23 participants.
I have a basic widget with a button to toggle the home lights; the button triggers the following AppIntent:
import WidgetKit
import AppIntents

struct ConfigurationAppIntent: WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "Bulb state"
    static var description = IntentDescription("This is an example widget.")
}

struct ToggleStateIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle light state"

    init() {
    }

    func perform() async throws -> some IntentResult {
        // Toggles the bulb and only returns once the call has completed.
        await WizClient.shared.toggleState()
        return .result()
    }
}
The problem is that the app has to be running under Xcode (on my phone, not the simulator) for this to work; when I stop Xcode, the button has to be pressed twice to trigger the AppIntent.
The toggle function works fine in the app itself with a Toggle component.
Here is the widget:
import WidgetKit
import SwiftUI

struct Provider: AppIntentTimelineProvider {
    func placeholder(in context: Context) -> SimpleEntry {
        SimpleEntry(date: Date(), configuration: ConfigurationAppIntent())
    }

    func snapshot(for configuration: ConfigurationAppIntent, in context: Context) async -> SimpleEntry {
        SimpleEntry(date: Date(), configuration: configuration)
    }

    func timeline(for configuration: ConfigurationAppIntent, in context: Context) async -> Timeline<SimpleEntry> {
        let timeline = Timeline(entries: [SimpleEntry(date: Date(), configuration: configuration)], policy: .atEnd)
        return timeline
    }
}

struct SimpleEntry: TimelineEntry {
    let date: Date
    let configuration: ConfigurationAppIntent
}

struct BulbActionsEntryView: View {
    var entry: Provider.Entry

    var body: some View {
        HStack {
            Button(intent: ToggleStateIntent()) {
                Text("Toggle")
            }
        }
        .padding(.vertical)
    }
}

struct BulbActions: Widget {
    let kind: String = "BulbActions"

    var body: some WidgetConfiguration {
        AppIntentConfiguration(kind: kind, intent: ConfigurationAppIntent.self, provider: Provider()) { entry in
            BulbActionsEntryView(entry: entry)
                .containerBackground(.fill.tertiary, for: .widget)
        }
    }
}

extension ConfigurationAppIntent {
    fileprivate static var test: ConfigurationAppIntent {
        let intent = ConfigurationAppIntent()
        print("Intent -> \(intent)")
        return intent
    }
}

#Preview(as: .systemSmall) {
    BulbActions()
} timeline: {
    SimpleEntry(date: .now, configuration: .test)
}
Please treat me as a beginner with Unity.
I want to learn to develop a visionOS VR app through Unity. I've tried to find a reasonably complete learning path to get started, but Unity's official website does not have much documentation on visionOS VR apps, so I hope you can suggest a complete route. Thank you!
Apple Developer videos, once downloaded, don't have the option to display captions. Is that by design? Is there a way to get the captions with the downloaded content?
Thanks
I hope this message finds you well. I recently had the opportunity to watch the insightful session titled "Improve Core ML Integration with Async Prediction" and was thoroughly impressed by the depth of information and the practical demonstration provided. The session offered valuable insights that I believe would greatly benefit my ongoing projects and my understanding of Core ML integration.
As I am keen on implementing the demonstrated workflows and techniques within my own work, I am reaching out to kindly request access to the source code and any related material presented during the session. Having access to the code would enable me to better understand the concepts discussed and apply them more effectively in real-world scenarios.
I believe that being able to review and experiment with the actual code would significantly enhance my learning experience and the implementation efficiency of my projects. It would also serve as a valuable resource for referencing best practices in Core ML integration and async prediction techniques.
Thank you very much for considering my request. I greatly appreciate the effort that went into creating such an informative session and am looking forward to potentially exploring the material in greater depth.
Best regards,
Fabio G.
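For reference, a rough sketch of the async prediction pattern that session covers (iOS 17 and later), not the session's actual sample code; the model URL and the "input" feature name below are placeholders for your own compiled model:

import CoreML
import CoreVideo

// Hedged sketch of Core ML async prediction; "input" and modelURL are
// placeholders, not names from the session's project.
func predict(pixelBuffer: CVPixelBuffer, modelURL: URL) async throws -> MLFeatureProvider {
    let model = try MLModel(contentsOf: modelURL)
    let features = try MLDictionaryFeatureProvider(
        dictionary: ["input": MLFeatureValue(pixelBuffer: pixelBuffer)]
    )
    // The async variant suspends instead of blocking, so several predictions
    // can be in flight concurrently from Swift concurrency code.
    return try await model.prediction(from: features)
}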
Hello, how can I add TipKit to my app for WWDC? I think TipKit may not be suitable for a playground. What would be the best approach for adding this feature to my app?
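For reference, a minimal sketch of what basic TipKit usage looks like in a SwiftUI view on iOS 17; the tip type and strings are placeholders, and whether TipKit is available inside an App Playground is exactly the open question here:

import SwiftUI
import TipKit

// Placeholder tip type for illustration only.
struct FavoriteTip: Tip {
    var title: Text { Text("Save as a favorite") }
    var message: Text? { Text("Tap the heart to keep this item handy.") }
}

struct ContentView: View {
    var body: some View {
        Image(systemName: "heart")
            // Anchors the tip's popover to this view until it is dismissed.
            .popoverTip(FavoriteTip())
            .task {
                // TipKit has to be configured once, typically at launch.
                try? Tips.configure()
            }
    }
}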
I want to create a function that returns the RSSI value of the currently connected Wi-Fi network on my iPhone. I tried fetching the value from the status bar, but I guess that support is no longer provided in Swift. I have tried the following solution, and others with a similar approach, but it doesn't seem to work now.
import UIKit

// Note: this walks the private status bar view hierarchy via KVC, which is
// likely why it no longer works on current iOS versions.
private func wifiStrength() -> Int? {
    let app = UIApplication.shared
    var rssi: Int?
    guard let statusBar = app.value(forKey: "statusBar") as? UIView,
          let foregroundView = statusBar.value(forKey: "foregroundView") as? UIView else {
        return rssi
    }
    for view in foregroundView.subviews {
        if let statusBarDataNetworkItemView = NSClassFromString("UIStatusBarDataNetworkItemView"),
           view.isKind(of: statusBarDataNetworkItemView) {
            if let val = view.value(forKey: "wifiStrengthRaw") as? Int {
                // print("rssi: \(val)")
                rssi = val
                break
            }
        }
    }
    return rssi
}
Is there a limit to the size of the object you want to capture with Object Capture? For example, could it capture a horse or another animal of a similar size?