      Latest reply on Aug 22, 2016 9:26 AM by dmorrow
      dmorrow Level 1 (0 points)

        In my app I have 3 build configurations: "PROD", "STAGING" and "DEV".

        My app gets its data from 3 different sources: "www.public-website.com", "staging.private-website.com" and "dev.private-website.com", depending on which configuration is running.

         

        The last two websites (staging & dev) do have DNS entries and are reachable. But we don't publish those URLs to the public, and their "robots.txt" files forbid any crawling.

         

        What I'm trying to do is put the "apple-app-site-association" file on staging.private-website.com and test my app using the STAGING build configuration, in the hope that I can test universal links without disturbing www.public-website.com. I just want to test it on STAGING, and then once it's working, put the same "apple-app-site-association" file on public.

         

        However, when I go to the Apple Search API Validation Tool, and plug in staging.private-website.com, I get this error message in the "Link To Application" section:

        • Could not extract required information for application links. Learn how to implement the recommended Universal Links.

         

        I have a file called "apple-app-site-association" at the top level of the staging site, and the (scrubbed) JSON looks like this:

         

        {
          "applinks": {
            "apps": [],
            "details": [
              {
                "appID": "ABCDEFG.com.mycompany.myapp",
                "paths": ["/path1/*"]
              }
            ]
          }
        }

        sigh, oh apple. This is the dev forums, can't I get an easy way to drop in a code snippet? I'm an iOS coder, not an HTML expert.

         

        Anyway, I have tested it, and apple-app-site-association is reachable. I can get to it just fine: I get a 200 on the call, and the content-type is application/json. Everything appears to be correct, and I've validated the JSON. But when I run it through the Search API Validation Tool, I get this unhelpful message.
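        As a double-check, the file's structure can be verified locally with a short script. This is just a rough sketch (Python 3; check_aasa is my own helper name, and the payload below is the scrubbed placeholder from above, not real values):

```python
import json

def check_aasa(raw):
    """Return a list of problems found in an apple-app-site-association payload."""
    problems = []
    data = json.loads(raw)  # raises ValueError if the file isn't valid JSON
    applinks = data.get("applinks")
    if applinks is None:
        problems.append('missing top-level "applinks" key')
        return problems
    if applinks.get("apps") != []:
        problems.append('"apps" should be present and an empty array')
    for i, detail in enumerate(applinks.get("details", [])):
        if "appID" not in detail:
            problems.append('details[%d] is missing "appID"' % i)
        if not detail.get("paths"):
            problems.append('details[%d] has no "paths"' % i)
    return problems

# The scrubbed payload from the post:
raw = '{"applinks":{"apps":[], "details":[{"appID":"ABCDEFG.com.mycompany.myapp","paths":["/path1/*"]}]}}'
print(check_aasa(raw))  # an empty list means no structural problems found
```

        Of course, this only checks the JSON structure, not whatever the validator itself is unhappy about.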

         

        So, could this be because robots.txt does not allow any crawling? Why that would matter in this case, I don't know, but I can't seem to find any other explanation.

         

        I just need verification on this, before I pester the web-team to modify robots.txt on our staging server.

         

        Of course, if that's not the case, I'm really having trouble understanding why the Search API Validation Tool can't read the apple-app-site-association file.

         

         

        New Data:

        I just inspected the certificate being used on staging.private-website.com. The domain listed on it is *.private-website.com. Could the wildcard character in the cert be the problem?

        • Re: how to test Universal Links with a "staging" server
          dmorrow Level 1 (0 points)

          OK, to answer my own question:

           

          1. Yes, it does appear that having a wildcard in my cert is the problem -- for the validator, not for the feature itself (see more below).

          2. However, universal links are testable, regardless. It's just that the validator doesn't like this.

          3. In my "associated-domains" entitlement (in my app), wildcards *don't* work, even though the docs say they do. I've seen references to this elsewhere on the web, and I'm confirming this behavior.

           

          So, to fix the problem with wildcards not working with my entitlements, here's what I did:

           

          - I created a "User Defined Setting" in the build settings of my project; I'm calling it WEB_DOMAIN. Since I have "PROD", "STAGING", and "DEV" configurations, I just set a different value for each one.

          - Then, in Target -> Capabilities -> Associated Domains, I set it like this:

          applinks:$(WEB_DOMAIN)
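          For reference, that capability ends up in the app's .entitlements file as roughly the fragment below (a sketch; the file name is hypothetical, and this relies on Xcode expanding $(WEB_DOMAIN) from build settings at build time):

```xml
<!-- Fragment of MyApp.entitlements (file name hypothetical) -->
<key>com.apple.developer.associated-domains</key>
<array>
    <string>applinks:$(WEB_DOMAIN)</string>
</array>
```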
          

           

          So, for whatever config I build for, it will point to the corresponding website. Works great.
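          Concretely, the per-configuration values look something like this (written as xcconfig-style lines for illustration; I actually set them in the User Defined Setting in Xcode's build settings UI, using the hostnames from the original post):

```
// WEB_DOMAIN value per build configuration (illustrative)
// PROD:
WEB_DOMAIN = www.public-website.com
// STAGING:
WEB_DOMAIN = staging.private-website.com
// DEV:
WEB_DOMAIN = dev.private-website.com
```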