I'm using an LPMetadataProvider to get metadata for URLs.
If I do this:
if itemProvider.hasItemConformingToTypeIdentifier(UTType.image.identifier) {
    let item = try? await itemProvider.loadItem(forTypeIdentifier: UTType.image.identifier)
    // continue with code to convert data to UIImage...
}
That seems to fail quite often, even on larger sites like Amazon, where users would expect to see an icon.
I've noticed it's because sometimes the type is dyn.agq80w5pbq7ww88brrfv085u.
I'm assuming this is because something about the website response or the image data does not let the system determine if it is actually an image.
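A quick way to poke at what the system thinks that identifier is (this is just an inspection sketch; in my case the dynamic type presumably doesn't conform to public.image, which would line up with the failing check above):

import UniformTypeIdentifiers

// Inspect the dynamic identifier the item provider reported.
let identifier = "dyn.agq80w5pbq7ww88brrfv085u"
if let type = UTType(identifier) {
    print(type.isDynamic)                    // true for dyn.* identifiers
    print(type.conforms(to: .image))         // presumably false here, matching the failing check
    print(type.preferredMIMEType ?? "none")  // any MIME type encoded in the identifier
    print(type.tags)                         // filename extensions / MIME types, if any
}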
If I just do this:
let type: String = "dyn.agq80w5pbq7ww88brrfv085u"
if itemProvider.hasItemConformingToTypeIdentifier(type) {
    let item = try? await itemProvider.loadItem(forTypeIdentifier: type)
    // continue with code to convert data to UIImage...
}
Then I get the icon, so it is there and it is an image. The problem is: does this dynamic type cover everything? Should I even be doing that?
Does anyone know precisely what causes this, and are there recommendations on better ways to handle it?
I know LPLinkView appears to do something to load these 'non-image' icons, so I am assuming there is a way.
My assumption is that it would be safe to look at the results of itemProvider.registeredTypeIdentifiers and, if it has something, use that as the type rather than coming up with a hardcoded list of types to check for.
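Something like this is what I have in mind (a rough, untested sketch; loadIcon is just a made-up helper, and it assumes whatever data the provider hands back can be decoded by UIImage(data:)):

import UIKit
import UniformTypeIdentifiers

// Hypothetical helper: prefer a registered type that conforms to public.image,
// otherwise fall back to whatever identifier the provider reports (which may
// be a dynamic dyn.* type), then load the raw data and try to decode it.
func loadIcon(from itemProvider: NSItemProvider) async -> UIImage? {
    let identifiers = itemProvider.registeredTypeIdentifiers
    let identifier = identifiers.first { UTType($0)?.conforms(to: .image) == true }
        ?? identifiers.first
    guard let identifier else { return nil }

    return await withCheckedContinuation { continuation in
        itemProvider.loadDataRepresentation(forTypeIdentifier: identifier) { data, _ in
            continuation.resume(returning: data.flatMap(UIImage.init(data:)))
        }
    }
}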
Link Presentation
Fetch, provide, and present rich links in your app using Link Presentation.
Posts under Link Presentation tag
3 Posts
In iMessage, when you link a twitter post, the preview shows the content of the tweet itself.
I wanted to replicate that, so I used LPLinkMetadata and even found the "hidden" metadata.value(forKey: "_summary") to get the description. Below is the full code:
import SwiftUI
import LinkPresentation

class LinkViewModel: ObservableObject {
    let metadataProvider = LPMetadataProvider()

    @Published var metadata: LPLinkMetadata?
    @Published var image: UIImage?

    init(link: String) {
        guard let url = URL(string: link) else {
            return
        }
        metadataProvider.startFetchingMetadata(for: url) { (metadata, error) in
            guard error == nil else {
                assertionFailure("Error")
                return
            }
            // Publish the metadata on the main queue.
            DispatchQueue.main.async {
                self.metadata = metadata
            }
            guard let imageProvider = metadata?.imageProvider else { return }
            imageProvider.loadObject(ofClass: UIImage.self) { (image, error) in
                guard error == nil else {
                    // handle error
                    return
                }
                if let image = image as? UIImage {
                    // do something with image
                    DispatchQueue.main.async {
                        self.image = image
                    }
                } else {
                    print("no image available")
                }
            }
        }
    }
}

struct MetadataView: View {
    @StateObject var vm: LinkViewModel

    var body: some View {
        VStack {
            if let metadata = vm.metadata {
                Text(metadata.title ?? "no title")
                // "_summary" is a private key, not part of the public LPLinkMetadata API.
                Text(metadata.value(forKey: "_summary") as? String ?? "no description")
            }
            if let uiImage = vm.image {
                Image(uiImage: uiImage)
                    .resizable()
                    .frame(width: 100, height: 100)
            }
        }
    }
}

struct ContentView: View {
    var links = ["https://www.google.com", "https://www.hotmail.com", "https://twitter.com/t3dotgg/status/1764398959513276630"]
    let metadataProvider = LPMetadataProvider()

    var body: some View {
        List(links, id: \.self) { item in
            Section {
                VStack {
                    Text(item)
                    MetadataView(vm: LinkViewModel(link: item))
                }
            }
        }
    }
}
With no luck. I even tried third-party Swift OpenGraph libraries (reading og:title), printed the HTML, added a custom User-Agent and so on, and still can't get it.
There was a great discussion on the Mastodon GitHub about it (for a server-side implementation), but replicating the iMessage behaviour seems hard.
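For reference, my manual attempt looked roughly like this (a sketch only; the User-Agent value is just an example and the regex is naive; for the twitter link this still comes back without the description):

import Foundation

// Fetch the HTML with a custom User-Agent and pull out the og:description meta tag.
func fetchOGDescription(from url: URL) async throws -> String? {
    var request = URLRequest(url: url)
    request.setValue("facebookexternalhit/1.1", forHTTPHeaderField: "User-Agent")

    let (data, _) = try await URLSession.shared.data(for: request)
    guard let html = String(data: data, encoding: .utf8) else { return nil }

    // Naive pattern: assumes property comes before content in the meta tag.
    let pattern = #"<meta[^>]+property="og:description"[^>]+content="([^"]*)""#
    let regex = try NSRegularExpression(pattern: pattern)
    let range = NSRange(html.startIndex..., in: html)
    guard let match = regex.firstMatch(in: html, range: range),
          let contentRange = Range(match.range(at: 1), in: html) else { return nil }
    return String(html[contentRange])
}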
Any tips? :-)
In iMessage you can link a twitter post and it gets the image (if any), the tweet content, and the title. Yet as a regular dev I don't think we can get the tweet content the way iMessage does.
From this GitHub issue on Mastodon, I know that everything we need is in the page's head, yet with simple Swift code using LPLinkMetadata we cannot get the description. The code is the same as in the post above.
The twitter link doesn't return any description. I also tried third-party OG libraries (reading og:title) in Swift with no success, yet it works in iMessage.
Any tips? :-)