I'm trying to follow the example in the documentation, "Keeping a widget up to date," by showing the current time and date, using the code below. It throws an error at the let futureDate statement: "Cannot use instance member 'components' within property initializer; property initializers run before 'self' is available."
Here's the section of code that's in the EntryView. BTW, the Widget is called CPCWidget.
var entry: Provider.Entry
let components = DateComponents(minute: 11, second: 14)
let futureDate = Calendar.current.date(byAdding: components, to: Date())!
var body: some View {
VStack(alignment: .center, spacing: 0) {
Text(futureDate, style: .relative)
.font(.largeTitle)
Text(entry.date, style: .relative)
.font(.custom("Times New Roman", size: 20))
}
}
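In case it helps later readers: the error happens because one stored property initializer (futureDate) tries to read another (components) before self exists. A minimal sketch of the usual workaround, assuming the Provider type from the widget template, is to make futureDate a computed property so it is evaluated after initialization:

```swift
import SwiftUI
import WidgetKit

struct CPCWidgetEntryView: View {
    var entry: Provider.Entry

    // Computed properties run after `self` is fully initialized,
    // so they may freely reference other instance members.
    var futureDate: Date {
        let components = DateComponents(minute: 11, second: 14)
        return Calendar.current.date(byAdding: components, to: Date())!
    }

    var body: some View {
        VStack(alignment: .center, spacing: 0) {
            Text(futureDate, style: .relative)
                .font(.largeTitle)
            Text(entry.date, style: .relative)
                .font(.custom("Times New Roman", size: 20))
        }
    }
}
```

Declaring the two properties with lazy, or computing the date inside body, would work for the same reason.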
Hi.
I've been trying to build a widget from the documentation and the WWDC 2020 code-along videos.
I've been trying to preview the widget in Xcode using the code from the WWDC Videos, but keep getting an "Unrecoverable error."
Is the preview working for Widgets, or is this planned for a future update?
BTW, I did try to get a copy of the error, but that doesn't seem to be working either.
Thanks,
Dan Uff
Hi all,
I'm testing an app that uses AVFoundation for speech. It works in iOS 13 but not 14. Here's the terminal output:
```
2020-06-26 13:21:09.832216-0400 CPSpeak2Me[1428:34231] [plugin] AddInstanceForFactory: No factory registered for id <CFUUID 0x60000288c0c0> F8BB1C28-BAE8-11D6-9C31-00039315CD46
2020-06-26 13:21:09.863918-0400 CPSpeak2Me[1428:34268] Creating client/daemon connection: B41FDA39-C097-4773-BBEC-ACEDA0C43CB3
2020-06-26 13:21:09.912054-0400 CPSpeak2Me[1428:34268] Got the query meta data reply for: com.apple.MobileAsset.VoiceServicesVocalizerVoice, response: 0
2020-06-26 13:21:09.933646-0400 CPSpeak2Me[1428:34268] Consumed extension
2020-06-26 13:21:09.935578-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice bd610ebf3cdee47a506b603c116b87c052feb5e6 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/bd610ebf3cdee47a506b603c116b87c052feb5e6.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:09.937481-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice bd610ebf3cdee47a506b603c116b87c052feb5e6 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/bd610ebf3cdee47a506b603c116b87c052feb5e6.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:09.939390-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice b42ef77f9de60534a8f5a56d277d965771dead47 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/b42ef77f9de60534a8f5a56d277d965771dead47.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:09.941170-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice bd76bcbd96e58eee5fd3549ef5351567ad7f7509 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/bd76bcbd96e58eee5fd3549ef5351567ad7f7509.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:09.942848-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice e76e4f07eb38fe8dadb9fc4a617e9763d1db2604 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/e76e4f07eb38fe8dadb9fc4a617e9763d1db2604.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:09.944485-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice 8174961b8c786e5a13efb7adec17fedf72bf6fc9 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/8174961b8c786e5a13efb7adec17fedf72bf6fc9.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:09.946085-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice 3c2c67350365770ef4a3d9ae26435b97fad58d2d local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/3c2c67350365770ef4a3d9ae26435b97fad58d2d.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:09.947891-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice bd610ebf3cdee47a506b603c116b87c052feb5e6 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/bd610ebf3cdee47a506b603c116b87c052feb5e6.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:09.949292-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServicesVocalizerVoice bd610ebf3cdee47a506b603c116b87c052feb5e6 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVocalizerVoice/bd610ebf3cdee47a506b603c116b87c052feb5e6.asset/AssetData', exists: NO, unable to update last access time 2 (MAOperationFailed)
2020-06-26 13:21:10.021210-0400 CPSpeak2Me[1428:34268] Got the query meta data reply for: com.apple.MobileAsset.VoiceServices.VoiceResources, response: 0
2020-06-26 13:21:10.023477-0400 CPSpeak2Me[1428:34268] Consumed extension
2020-06-26 13:21:10.024774-0400 CPSpeak2Me[1428:34268] getLocalPath asset com.apple.MobileAsset.VoiceServices.VoiceResources a4a4fa16475de8a1e60dfe92582ddb15bb5fa013 local path is '/tmp/com.apple.mobileassetd/AssetsV2/comappleMobileAssetVoiceServicesVoiceResources/a4a4fa16475de8a1e60dfe92582ddb15bb5fa013.asset/AssetData', exists: YES
2020-06-26 13:21:10.235860-0400 CPSpeak2Me[1428:34296] HALBIOBufferManagerClient::GetIOBuffer: the stream index is out of range
2020-06-26 13:21:10.236138-0400 CPSpeak2Me[1428:34296] HALBIOBufferManager_Client::GetIOBuffer: the stream index is out of range
2020-06-26 13:21:10.248193-0400 CPSpeak2Me[1428:34296] [aqme] 255: AQDefaultDevice (1): output stream 0: null buffer
2020-06-26 13:21:10.248690-0400 CPSpeak2Me[1428:34296] [aqme] 1778: EXCEPTION thrown (-50): error != 0
Message from debugger: Terminated due to signal 15
```
I need a YES or NO answer: will macOS 10.16 work on a 2012 MacBook Pro?
Hi,
I have two iOS apps that use the AVFoundation framework for speech. Everything works fine. But if I go from one app into the other, which uses the same framework, that app crashes, even if I take the first app out of memory. The two apps are using two different voices as well. While it would be almost impossible for a user to have the two apps on the same device, one never knows :-)
Any ideas?
Dan Uff
Hi,
My app is a simple one, where someone taps on a button and a sound file should play. But when the button is tapped, the sound file doesn't play.
Here's the code:
import SwiftUI
import AVFoundation
struct ContentView: View {
private var autoPlayer: AVAudioPlayer = AVAudioPlayer()
var body: some View {
GeometryReader { _ in
ZStack {
Color.black
.edgesIgnoringSafeArea(.all)
Circle()
.frame(width: 400, height: 400)
.foregroundColor(Color.white)
Circle()
.frame(width: 340, height: 500)
.foregroundColor(Color.red)
Button(action: {
print("Button Tapped!")
let path = Bundle.main.path(forResource: "siren", ofType: "mp3")!
do {
let playFile = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: path))
playFile.play()
} catch {
print(error)
}
}) {
Text("Tap HERE for Alarm")
.font(.largeTitle)
.foregroundColor(Color.white)
.bold()
.italic()
}
}
}.navigationBarTitle(Text("Personal Alarm"))
}
func playSound()
{
print("Button Tapped!")
let path = Bundle.main.path(forResource: "siren.mp3", ofType: nil)!
let url = URL(fileURLWithPath: path)
do {
let soundFile = try AVAudioPlayer(contentsOf: url)
soundFile.play()
} catch {
print("Cannot Play Sound File!")
}
}
}
struct ContentView_Previews: PreviewProvider {
static var previews: some View {
ContentView()
}
}
Any ideas?
Dan Uff
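A note for anyone hitting the same silence: an AVAudioPlayer created inside the button action is a local value, so it is deallocated as soon as the closure returns, often before any audio is heard. A minimal sketch of the common fix, keeping the same "siren.mp3" resource from the post, is to hold the player in state so it outlives the tap:

```swift
import SwiftUI
import AVFoundation

struct ContentView: View {
    // Strong reference: without this, the player is released
    // when the button action returns and playback stops.
    @State private var player: AVAudioPlayer?

    var body: some View {
        Button("Tap HERE for Alarm") {
            guard let url = Bundle.main.url(forResource: "siren",
                                            withExtension: "mp3") else {
                print("Sound file not found")
                return
            }
            do {
                player = try AVAudioPlayer(contentsOf: url)
                player?.play()
            } catch {
                print(error)
            }
        }
    }
}
```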
Hi,
I've been an iOS developer for many years, have just written my first Mac app (thank you, SwiftUI), and am getting ready to distribute it to the Mac App Store.
I want to know if I need to do anything else to the Mac app, other than the usual bundling for the App Store, such as creating a setup app, or can I distribute it like an iOS app? Is there anything else I'd need to know? (Yes, I did read all the docs I could find on this subject.)
Thanks,
Dan Uff
Hi,
I have been trying to make an app with SwiftUI. Everything works, but when the app is shown on the screen (iPhone or iPad), everything is shifted to the left, despite using GeometryReader.
Such as:
What I need: ---> Centered <---
What is displayed: <--- Everything shifted to the left.
Any ideas?
Dan Uff
import SwiftUI
import AVFoundation
import AVKit
struct Middle: View {
@State private var txtInput: String = ""
@State private var txtEmpty = false
var body: some View {
GeometryReader() {_ in
ZStack {
Color.init(red: 0, green: 0, blue: 1)
.edgesIgnoringSafeArea(.all)
}
VStack (alignment: .center){
Text("What would you like to say?")
.bold()
.font(.custom("Times New Roman", size: 27))
.foregroundColor(Color.white)
.multilineTextAlignment(.center)
.accessibility(hint: Text("Type what you want to say here. Tap Return when Finished."))
TextField("Tap here to start.", text: self.$txtInput)
.background(Color .white)
.frame(width: 320, height: 50, alignment: .leading)
.font(.custom("ChalkBoard", size: 30))
.textFieldStyle(RoundedBorderTextFieldStyle())
.multilineTextAlignment(.leading)
.keyboardType(.default)
.accessibility(hint: Text("Start typing here."))
// Count text as the user types:
Text("Word Count: \(self.txtInput.count)")
.bold()
.font(.custom("Chalkboard", size: 30))
.foregroundColor(Color.white)
Button(action: {
if self.txtInput.isEmpty
{
let toSay = AVSpeechUtterance(string: "I have nothing to say. Type something in the text area above the speaker button.")
let alex = AVSpeechSynthesizer()
alex.speak(toSay)
}else {
let toSay = AVSpeechUtterance(string:
self.txtInput)
let alex = AVSpeechSynthesizer()
alex.speak(toSay)
}
}) {
VStack(alignment: .center) {
Image(systemName: "speaker.2")
.resizable()
.frame(width: 80, height: 45)
.foregroundColor(Color.white)
.accessibility(hint: Text("Tap speaker to talk."))
.padding(20)
.background(Color.green)
.padding()
.clipShape(Rectangle())
.buttonStyle(PlainButtonStyle())
}
// Clear Button:
Button(action: {
//
}) {
Text("")
.font(.largeTitle)
.foregroundColor(Color.white)
}
Image(systemName: "trash")
.resizable()
.frame(width:70, height: 45)
.foregroundColor(Color.white)
.accessibility(hint: Text(""))
.padding(20)
.background(Color.green)
.padding()
.clipShape(Rectangle())
.buttonStyle(PlainButtonStyle())
.onTapGesture {
self.txtInput = ""
}.padding()
}.padding()
}
}
}
struct Middle_Previews: PreviewProvider {
static var previews: some View {
Middle()
}
}
}
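One observation on the pasted code that may explain the left shift: GeometryReader positions its content at the top-leading corner, and the VStack sits outside the ZStack's braces, so nothing centers it. A minimal sketch of one possible fix (trimmed to the first Text for brevity) is to put the content inside the ZStack and give the stack a flexible frame:

```swift
import SwiftUI

struct Middle: View {
    var body: some View {
        ZStack {
            Color(red: 0, green: 0, blue: 1)
                .edgesIgnoringSafeArea(.all)
            VStack(alignment: .center) {
                Text("What would you like to say?")
                    .foregroundColor(Color.white)
                    .multilineTextAlignment(.center)
            }
            // A flexible frame makes the stack claim the full width,
            // so .center alignment actually centers it on screen.
            .frame(maxWidth: .infinity)
        }
    }
}
```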
Hi,
I am trying to write my first Mac app with SwiftUI. I have everything working, but when the app comes up, it only shows half of the app. I have to manually resize the window each time the app is run.
How do I adjust the size so the entire app can be viewed at one time?
Thanks,
Dan Uff
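For later readers: one common approach (a sketch, and the sizes here are hypothetical, not from the post) is to give the root view an explicit minimum frame, which AppKit uses when sizing the window:

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        Text("Hello, Mac!")
            // Minimum sizes are a lower bound for the window,
            // so the app opens large enough to show everything.
            .frame(minWidth: 800, minHeight: 600)
    }
}
```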
Hi,
I'm getting the following warning when I compile my test app: "Braces here form a trailing closure separated from its callee by multiple newlines."
Here's the code. Note where the warning appears:
/*
How to present an alertView in SwiftUI
*/
import SwiftUI
struct ContentView: View {
@State private var showSheet = false
var body: some View {
VStack {
Text("Click below for an Alert")
.fontWeight(.bold)
.font(.largeTitle)
Button(action: {
self.showSheet = true
}
) {
Text("Click Me")
.bold()
.foregroundColor(.white)
.background(Color.red)
.font(.largeTitle)
.padding()
}
.alert(isPresented: $showSheet)
{ // <---- The warning is HERE.
Alert(title: Text("Alert"), message: Text("About **** time"), dismissButton: .default(Text("OK")))
}
}
}
struct ContentView_Previews: PreviewProvider {
static var previews: some View {
ContentView()
}
}
}
Thanks in advance,
Dan Uff
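For anyone seeing the same warning: the compiler complains because the trailing closure's opening brace sits on its own line, separated from the .alert(isPresented:) call. A minimal sketch of the conventional layout that silences it:

```swift
import SwiftUI

struct ContentView: View {
    @State private var showSheet = false

    var body: some View {
        Button("Click Me") {
            showSheet = true
        }
        // Keeping the opening brace on the same line as the call
        // makes it an ordinary trailing closure, with no warning.
        .alert(isPresented: $showSheet) {
            Alert(title: Text("Alert"),
                  message: Text("About **** time"),
                  dismissButton: .default(Text("OK")))
        }
    }
}
```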
Hi,
I'm trying to convert an existing Swift 5 game to SwiftUI. Almost everything works, except when it's time to change the image based on the result. Here's the code:
import SwiftUI
import Foundation
struct ContentView: View {
@State private var randomIndex: Int = 0
var body: some View {
VStack() {
Image("TwoSides")
.resizable()
.frame(width: 170, height: 90, alignment: .center)
.position(x: 100, y: 40)
Text("Heads or Tails").fontWeight(.bold)
.frame(width: CGFloat(200), height: CGFloat(5), alignment: .center)
.position(x: 90, y: 40)
Button(action: {
let array = ["Heads","Tails"]
let randomIndex = Int(arc4random_uniform(UInt32(array.count)))
if (randomIndex == 1)
{
print("Heads")
Image("Heads")
.resizable()
.frame(width: 170, height: 90,alignment: .center)
}else {
print("Tails")
Image("Tails")
.resizable()
.frame(width: 170, height: 90,alignment: .center)
.position(x: 100, y: 40)
}
}) {
VStack {
Text("Flip Coin")
.padding(15)
.background(Color.green)
.foregroundColor(Color.white)
}
}
}
}
}
struct ContentView_Previews: PreviewProvider {
static var previews: some View {
ContentView()
}
}
Any ideas would be appreciated.
Dan Uff
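A note on why the image never changes: SwiftUI views created inside a Button action (the Image("Heads") and Image("Tails") calls) are built and immediately discarded; they never enter the view hierarchy. A minimal sketch of the usual pattern is to store the result in @State and let body render the matching image:

```swift
import SwiftUI

struct ContentView: View {
    // The view reads this state; changing it re-renders the Image.
    @State private var face = "TwoSides"

    var body: some View {
        VStack {
            Image(face)
                .resizable()
                .frame(width: 170, height: 90)
            Text("Heads or Tails").fontWeight(.bold)
            Button(action: {
                // Pick a side and update state; body does the rest.
                face = Bool.random() ? "Heads" : "Tails"
            }) {
                Text("Flip Coin")
                    .padding(15)
                    .background(Color.green)
                    .foregroundColor(Color.white)
            }
        }
    }
}
```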
Just a heads up for those of us who are making standalone Apple Watch apps:
Before you upload them for review, make sure you remove any mention of WatchKit in the Target and Project settings of your app.
For example: appname.watchkitapp
You would think that Xcode would remove the extension before the app is uploaded for review.
Thanks,
Dan Uff
Hi all,
I'm making an app that needs a flip animation when someone taps a button.
Here's the code:
import WatchKit
import Foundation
class InterfaceController: WKInterfaceController {
@IBOutlet var coin: WKInterfaceImage!
@IBOutlet var txt: WKInterfaceLabel!
override func awake(withContext context: Any?) {
super.awake(withContext: context)
txt.setText("Heads or Tails?")
}
@IBAction func CallIt()
{
let array = ["Heads","Tails"]
let randomIndex = Int(arc4random_uniform(UInt32(array.count)))
if randomIndex == 1
{
let up = UIImage(named: "Heads")
coin.setImage(up)
txt.setText("Heads")
}else {
let down = UIImage(named: "Tails")
coin.setImage(down)
txt.setText("Tails")
}
}
}
Thanks in advance,
Dan Uff
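For readers wanting the flip itself: WKInterfaceImage can play a sequence of frames as an animation. A rough sketch under stated assumptions — the asset names ("Flip0" through "Flip9") and frame count are hypothetical, and you would still set the final face afterward as in the original code:

```swift
import WatchKit

class InterfaceController: WKInterfaceController {
    @IBOutlet var coin: WKInterfaceImage!
    @IBOutlet var txt: WKInterfaceLabel!

    @IBAction func callIt() {
        // Assumes frames named "Flip0"..."Flip9" exist in the
        // watch app's asset catalog (hypothetical names).
        coin.setImageNamed("Flip")
        coin.startAnimatingWithImages(in: NSRange(location: 0, length: 10),
                                      duration: 0.5,
                                      repeatCount: 1)
        let heads = Bool.random()
        coin.setImage(UIImage(named: heads ? "Heads" : "Tails"))
        txt.setText(heads ? "Heads" : "Tails")
    }
}
```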
Hi,
I'm trying to slowly add SwiftUI view controllers to an existing UIKit app.
On iPhone and iPad, using a storyboard and a hosting controller, I use the code below to go from a UIKit view controller to a SwiftUI view:
@IBSegueAction func SwiftUIAction(_ coder: NSCoder) -> UIViewController? {
    return UIHostingController(coder: coder, rootView: SecondController())
}
The above code does NOT work with the Apple TV, and it doesn't give me an error (or I would put it here). What am I missing?
Thanks,
Dan Uff
Hi,
My app has been rejected four times for this problem, so bear with me...
A few weeks ago, I posted that I was having problems with my app showing a blank screen when someone went into landscape mode.
Someone suggested using .navigationViewStyle(StackNavigationViewStyle()), which seemed to work the first few times, and then stopped working after I sent the app in for review.
I'm still having the same problem, and it is becoming maddening.
Here's the code in question (again):
import SwiftUI
import Foundation
struct ContentView: View {
var body: some View {
NavigationView() {
List (){
Section(header: Text("Main Information"))
{
NavigationLink(destination: Introduction())
{
Image(systemName: "exclamationmark.circle").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Introduction").tag(1)
.font(.headline)
}
NavigationLink(destination: Definitions())
{
Image(systemName: "questionmark.circle").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Definitions").tag(2)
.font(.headline)
}
NavigationLink(destination: TypesOfAbuse())
{
Image(systemName: "table").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Types").tag(2)
.font(.headline)
}
NavigationLink(destination: Effects())
{
Image(systemName: "exclamationmark.circle").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Effects").tag(2)
.font(.headline)
}
NavigationLink(destination: WhereAbuseOccures())
{
Image(systemName: "questionmark.circle").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Where it Occurs").tag(3)
.font(.headline)
}
NavigationLink(destination: RecognizingChildAbuse())
{
Image(systemName: "xmark.circle").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Recognizing Abuse").tag(4)
.font(.headline)
}
NavigationLink(destination: ReportingSuspectedChildAbuse())
{
Image(systemName: "phone.arrow.right").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Reporting Child Abuse").tag(5)
.font(.headline)
}
NavigationLink(destination: ChildrenCanBeHelped())
{
Image(systemName: "person.circle").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Children Can Be Helped").tag(7)
.font(.headline)
}
NavigationLink(destination: CountryMenu())
{
Image(systemName: "phone.circle").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Phone Numbers").tag(8)
.font(.headline)
}
NavigationLink(destination: Search())
{
Image(systemName: "magnifyingglass.circle").resizable()
.frame(width: 32, height: 32, alignment: .leading)
Text("Search").tag(9)
.font(.headline)
}
}
Section(header: Text("App Information & Support"))
{
NavigationLink(destination: About())
{
Text("About").fontWeight(.bold)
}
NavigationLink(destination: Support())
{
Text("Support").fontWeight(.bold)
}
}
Section(header: Text("Version"))
{
Text("1.0").fontWeight(.bold)
}
}
.navigationViewStyle(StackNavigationViewStyle())
.navigationBarTitle(Text("Child Abuse"))
.listStyle(GroupedListStyle())
}
}
struct ContentView_Previews: PreviewProvider {
static var previews: some View {
Group {
ContentView()
.previewDevice("iPhone XS Max")
.environment(\.colorScheme, .light)
}
}
}
}
extension View {
func phoneOnlyStackNavigationView() -> some View {
if UIDevice.current.userInterfaceIdiom == .phone {
return AnyView(self.navigationViewStyle(StackNavigationViewStyle()))
} else {
return AnyView(self)
}
}
}
As you can see on line 126, I have the suggested option.
By the way, lines 149-157 were from a website to try to get this working, but to no avail.
Thanks,
Dan Uff
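One observation on the pasted code, offered as a possible cause rather than a guaranteed fix: the .navigationViewStyle modifier is chained onto the List, but it only takes effect when attached to the NavigationView itself. A minimal sketch of the placement, with the list contents trimmed to one row:

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        NavigationView {
            List {
                Text("Introduction")
            }
            .navigationBarTitle(Text("Child Abuse"))
            .listStyle(GroupedListStyle())
        }
        // Attached to the NavigationView, not the List:
        // this is what forces single-column (stack) behavior
        // and avoids the blank detail pane in landscape.
        .navigationViewStyle(StackNavigationViewStyle())
    }
}
```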