Fixing X.509 Certificates

This is a continuation of a series of posts about how to correctly configure a TLS client using JSSE, using The Most Dangerous Code in the World as a guide. This post is about X.509 certificates in TLS, and includes some videos showing both what the vulnerabilities are and how to fix them. I highly recommend the videos, as they do an excellent job of describing the problems that TLS faces in general.

Also, JDK 1.8 just came out and has much better encryption. Now would be a good time to upgrade.

Table of Contents

Part One: We talk about how to correctly use and verify X.509 certificates.

  • What X.509 Certificates Do
  • Understanding Chain of Trust
  • Understanding Certificate Signature Forgery
  • Understanding Signature Public Key Cracking

Part Two: We discuss how to check X.509 certificates.

  • Validating a Certificate in JSSE
  • Validating Key Sizes and Signature Algorithms

What X.509 Certificates Do

The previous post talked about using secure ciphers and algorithms. This alone is enough to set up a secure connection, but there's no guarantee that you are talking to the server that you think you are talking to.

Without some means to verify the identity of a remote server, an attacker could still present itself as the remote server and then forward the secure connection onto the remote server. This is the problem that Netscape had.

As it turned out, another organization had already come up with a solution. The ITU-T had directory services that needed authentication, and had set up a system of public key certificates in a format called X.509, using a binary encoding known as ASN.1 DER. That entire system was copied wholesale for use in SSL, and X.509 certificates became the way to verify the identity of a server.

The best way to think about public key certificates is as a passport system. A certificate establishes information about its bearer in a way that is difficult to forge. This is why certificate verification is so important: accepting any certificate means that an attacker's certificate will be blindly accepted.

X.509 certificates contain a public key (typically RSA based), and name a digest algorithm (typically in the SHA-2 family, e.g. SHA-512) which provides a cryptographic hash. Together these are known as the signature algorithm (e.g. "SHA512withRSA"). One certificate can sign another by taking all the DER encoded bits of the new certificate (the "to be signed" portion – basically everything except the signature itself) and passing them through the digest algorithm to create a cryptographic hash. That hash is then signed by the private key of the organization owning the issuing certificate, and the result is stuck onto the end of the new certificate in a "SignatureValue" field. Because the issuer's public key is available, and the hash could only have been generated from the certificate that was given as input, we can treat the new certificate as "signed" by the issuer.
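The hash-then-sign mechanics just described can be sketched with the JCA directly. This is a minimal, hypothetical example – it signs a stand-in byte array rather than a real DER-encoded certificate body – but it shows the standard "SHA512withRSA" algorithm name and why only the issuer's public key is needed to verify:

```java
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

public class SignDemo {
    static boolean signAndVerify() throws Exception {
        // An RSA key pair standing in for the issuing CA's key (hypothetical).
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair issuer = kpg.generateKeyPair();

        // Stand-in for the DER encoded "to be signed" bytes of the new certificate.
        byte[] tbsCertificate = "tbsCertificate bytes".getBytes("UTF-8");

        // "SHA512withRSA": hash with SHA-512, then sign the hash with the issuer's private key.
        Signature signer = Signature.getInstance("SHA512withRSA");
        signer.initSign(issuer.getPrivate());
        signer.update(tbsCertificate);
        byte[] signatureValue = signer.sign(); // this is what goes in the SignatureValue field

        // Anyone holding the issuer's public key can verify the signature.
        Signature verifier = Signature.getInstance("SHA512withRSA");
        verifier.initVerify(issuer.getPublic());
        verifier.update(tbsCertificate);
        return verifier.verify(signatureValue);
    }

    public static void main(String[] args) throws Exception {
        System.out.println("verified = " + signAndVerify());
    }
}
```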

So far, so good. Unfortunately, X.509 certificates are complex. Very few people understand (or agree on) the various fields that can be involved in X.509 certificates, and even fewer understand ASN.1 DER, the binary format that X.509 is encoded in (which has led to some interesting attacks on the format). So much of the original X.509 specification was vague that PKIX was created to nail down some of the extensions. Currently, the important ones seem to be the subject name (and subjectAlternativeName), basicConstraints, and keyUsage.

There are other fields in X.509, but in practice, X.509 compatibility is so broken that few of them matter. For example, nameConstraints is considered near useless and policyConstraints has been misunderstood and exploited.

So if you want to do the minimum amount of work, all you need is some approximation to a DN, maybe a basicConstraints, and if you're feeling really enthusiastic, keyUsage (although this is often ignored by implementations, see the part 2a slides for examples. Even basicConstraints, the single most fundamental extension in a certificate, and in most cases just a single boolean value, was widely ignored until not too long ago).

Peter Gutmann

Peter Gutmann is an excellent resource on X.509 certificates (although he does have a tendency to rant). Read the X.509 Style Guide, check out the X.509 bits of Godzilla Crypto Tutorial, and buy Engineering Security when it comes out of draft – it has over 500 pages of exhaustively detailed security fails.

If you're not up for that, the best overall reference is Zytrax's SSL Survival Guide, and the presentation of "Black Ops of PKI" by Dan Kaminsky is a good introduction:

Understanding Chain of Trust

In TLS, the server not only sends its own certificate (known as an "end entity certificate" or EE), but also a chain of certificates that leads up to (but does not include) a root CA certificate issued by a certificate authority (CA for short). Each certificate in the chain is signed by the one above it, so that it is known to be authentic. Certificate validation in TLS goes through a specific algorithm: it validates each individual certificate, then verifies each signature in the chain, to establish a chain of trust.

Bad things can happen if the chain of trust only checks the signature and does not also check the keyUsage and the basicConstraints fields in X.509. Moxie Marlinspike has an excellent presentation at DEFCON 17 on defeating TLS, starting off with subverting the chain of trust:

Understanding Certificate Signature Forgery

Certificates should be signed with an algorithm from the SHA-2 family (i.e. at least SHA-256), because this prevents signature forgery.

Certificates work because they can say "this certificate is good because it has been signed by someone I trust." If you can forge a signature, then you can represent yourself as a certificate authority. In MD5 considered harmful today, a team showed that they were able to forge an MD5-signed certificate in exactly this manner:

Since the original paper, an MD5 based attack like this has been seen in the wild. A virus called Flame forged a signature (jumping through a series of extremely difficult technical hurdles), and used it to hijack the Windows Update mechanism used by Microsoft to patch machines, completely compromising almost 200 servers.

MD2 was broken in this paper, and is no longer considered a secure hash algorithm. MD4 is considered historic. As shown in the paper and video, MD5 is out, and the current advice is to avoid using the MD5 algorithm in any capacity. Mozilla is even more explicit about not using MD5 as a hash algorithm for intermediate and end entity certificates.

SHA1 has not been completely broken yet, but it is starting to look very weak. The current advice is to stop using SHA-1 as soon as practical and it has been deprecated by Microsoft. Using SHA-1 is still allowed by NIST on existing certificates though.

Federal agencies may use SHA-1 for the following applications: verifying old digital signatures and time stamps, generating and verifying hash-based message authentication codes (HMACs), key derivation functions (KDFs), and random bit/number generation. Further guidance on the use of SHA-1 is provided in SP 800-131A.

NIST's Policy on hash functions, September 28, 2012

Even the JSSE documentation itself says that SHA-2 is required, although it leaves this as an exercise for the reader:

"The strict profile suggest all certificates should be signed with SHA-2 or stronger hash functions. In JSSE, the processes to choose a certificate for the remote peer and validate the certificate received from remote peer are controlled by KeyManager/X509KeyManager and TrustManager/X509TrustManager. By default, the SunJSSE provider does not set any limit on the certificate's hash functions. Considering the above strict profile, the coder should customize the KeyManager and TrustManager, and limit that only those certificate signed with SHA-2 or stronger hash functions are available or trusted."

TLS and NIST'S Policy on Hash Functions

So: use the SHA-2 family. And indeed, most public certificates (over 95%) are signed this way.

Understanding Signature Public Key Cracking

An X.509 certificate has an embedded public key, almost universally RSA. An RSA key has a modulus (its size is also known as the key size or key length), which is intended to be infeasible to factor. Some of these public keys were created at a time when computers were far smaller and weaker than they are now. Simply put, their key size is now far too small. Those public keys may still be valid, but the security they provide isn't adequate against today's technology.
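As a quick illustration of what "key size" means here (a minimal sketch, not from the original post): for RSA, it is simply the bit length of the modulus, which you can read straight off the public key.

```java
import java.security.KeyPairGenerator;
import java.security.interfaces.RSAPublicKey;

public class KeySizeDemo {
    static int keySize() throws Exception {
        // Generate a fresh 2048-bit RSA key for demonstration purposes.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        RSAPublicKey pub = (RSAPublicKey) kpg.generateKeyPair().getPublic();
        // The "key size" of an RSA key is the bit length of its modulus.
        return pub.getModulus().bitLength();
    }

    public static void main(String[] args) throws Exception {
        System.out.println("keySize = " + keySize()); // keySize = 2048
    }
}
```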

The Mozilla Wiki brings the point home in three paragraphs:

The other concern that needs to be addressed is that of RSA1024 being too small a modulus to be robust against faster computers. Unlike a signature algorithm, where only intermediate and end-entity certificates are impacted, fast math means we have to disable or remove all instances of 1024-bit moduli, including the root certificates.

The NIST recommendation is to discontinue 1024-bit RSA certificates by December 31, 2010. Therefore, CAs have been advised that they should not sign any more certificates under their 1024-bit roots by the end of this year.

The date for disabling/removing 1024-bit root certificates will be dependent on the state of the art in public key cryptography, but under no circumstances should any party expect continued support for this modulus size past December 31, 2013. As mentioned above, this date could get moved up substantially if new attacks are discovered. We recommend all parties involved in secure transactions on the web move away from 1024-bit moduli as soon as possible.

Dates for Phasing out MD5-based signatures and 1024-bit moduli

This needs the all caps treatment:

KEY SIZE MUST BE CHECKED ON EVERY SIGNATURE IN THE CERTIFICATE, INCLUDING THE ROOT CERTIFICATE.

and:

UNDER NO CIRCUMSTANCES SHOULD ANY PARTY EXPECT SUPPORT FOR 1024 BIT RSA KEYS IN 2014.

1024 bit certificates are dead, dead, dead. They cannot be considered secure. NIST has recommended at least 2048 bits since 2013, there's a website entirely devoted to appropriate key lengths, and it's covered extensively in key management solutions. The certificate authorities stopped issuing them a while ago, and over 95% of trusted leaf certificates and 95% of trusted signing certificates use NIST recommended key sizes.

The same caveats apply to DSA and ECC key sizes: keylength.com has the details.

Miscellaneous

OWASP lists some guidelines on creating certificates, notably "Do not use wildcard certificates" and "Do not use RFC 1918 addresses in certificates". While these are undoubtedly questionable practices, I don't think it's appropriate to have rules forbidding them.

Part Two: Implementation

The relevant documentation is the Certificate Path Programmer's Guide, also known as the Java PKI API Programmer's Guide.

Despite listing problems in verification above, I'm going to assume that JSSE checks certificates and certificate chains correctly, and doesn't have horrible bugs in its implementation. I am concerned that JSSE may have vulnerabilities, but part of the problem is knowing exactly what the correct behavior should be: TLS does not come with a reference implementation or a reference suite. As far as I know, JSSE has not been subject to NIST PKI testing or the X.509 test suite from CPNI, and CPNI doesn't release their test suite to the public. I am also unaware of any publicly available X.509 certificate fuzzing tools.

There is a certificate testing tool called tlspretense, which (once it is correctly configured) will run a suite of incorrect certificates and produce a nice report.

What I can do is make sure that weak algorithms and key sizes are disabled, even in 1.6.

Validating a Certificate in JSSE

Validating a certificate by itself is easy. Certificate support lives in java.security.cert, and basic certificate validation (including expiration checking) is done using X509Certificate:

certificate.checkValidity()
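Fleshed out, that one-liner might look like this. The class name and structure are mine, and it borrows the JDK's default trust store purely as a convenient source of certificates to check – checkValidity throws a checked exception rather than returning a boolean:

```java
import javax.net.ssl.TrustManagerFactory;
import javax.net.ssl.X509TrustManager;
import java.security.KeyStore;
import java.security.cert.CertificateExpiredException;
import java.security.cert.CertificateNotYetValidException;
import java.security.cert.X509Certificate;

public class ValidityDemo {
    // Returns {validCount, expiredOrNotYetValidCount}.
    static int[] countValidity() throws Exception {
        // Initializing with null loads the JDK's default trust store.
        TrustManagerFactory tmf =
            TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init((KeyStore) null);
        X509TrustManager tm = (X509TrustManager) tmf.getTrustManagers()[0];

        int valid = 0, notValid = 0;
        for (X509Certificate cert : tm.getAcceptedIssuers()) {
            try {
                cert.checkValidity(); // throws if "now" is outside notBefore/notAfter
                valid++;
            } catch (CertificateExpiredException | CertificateNotYetValidException e) {
                notValid++;
            }
        }
        return new int[] { valid, notValid };
    }

    public static void main(String[] args) throws Exception {
        int[] counts = countValidity();
        System.out.println("valid = " + counts[0] + ", expired or not yet valid = " + counts[1]);
    }
}
```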

An interesting side note – although a trust store contains certificates, the fact that they are X.509 certificates is a detail. Anchors are just subject distinguished name and public key bindings. This means they don't have to be signed, and don't really have an expiration date. This tripped me (and a few others) up, but RFC 3280 and RFC 5280 are quite clear that expiration doesn't apply to trust anchors or trust stores.

Validating Key Sizes and Signature Algorithms

We need to make sure that JSSE is not accepting weak certificates. In particular, we want to check that the X.509 certificates have a decent signature algorithm and a decent key size.

Now, there is a jdk.certpath.disabledAlgorithms security property in JDK 1.7 that looks very close to doing what we want; setting it was covered in the previous post. It restricts the algorithms and key sizes accepted during X.509 certificate path validation. You define it in a security.properties file like so:

jdk.certpath.disabledAlgorithms=MD2, MD4, MD5, SHA1, SHA224, SHA256, SHA384, SHA512, RSA, DSA, EC

This property is then read by the class X509DisabledAlgConstraints in SSLAlgorithmConstraints.java:

private final static AlgorithmConstraints x509DisabledAlgConstraints =
new X509DisabledAlgConstraints();

Note the "private final static" here – the security property is read once, when this class is loaded, so you can't change it at runtime after that. You can, as a workaround, set the constraints dynamically from setAlgorithmConstraints.
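For completeness, the property can also be set programmatically through java.security.Security, with the caveat just mentioned: it must happen before the class that reads it is loaded. The property value here is illustrative, not a recommendation:

```java
import java.security.Security;

public class DisabledAlgsDemo {
    public static void main(String[] args) {
        // Hypothetical value; must be set before any certpath/JSSE class reads the property.
        Security.setProperty("jdk.certpath.disabledAlgorithms",
            "MD2, MD4, MD5, RSA keySize < 1024");
        System.out.println(Security.getProperty("jdk.certpath.disabledAlgorithms"));
    }
}
```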

But there's another problem. jdk.certpath.disabledAlgorithms is only in 1.7 and is global across the JVM. We need to support JDK 1.6 and make it local to the SSLContext. We can do better.

Here's what an example configuration looks like:

ws.ssl {
  disabledSignatureAlgorithms = "MD2, MD4, MD5"
  disabledKeyAlgorithms = "RSA keySize <= 1024, DSA keySize <= 1024, EC keySize <= 160"
}

I'll skip over the details of how the parsing and algorithm decomposition are done, except to say that Scala contains a parser combinator library which makes writing small parsers very easy. On configuration, each statement parses into an AlgorithmConstraint that checks whether the certificate's key size or algorithm matches.
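I won't reproduce the real Algorithms.decomposes here, but a crude approximation of the decomposition step – splitting a JCA signature algorithm name into its digest and key components so each can be matched against constraints – might look like this (the helper name is mine):

```java
import java.util.Arrays;

public class Decompose {
    // Hypothetical stand-in for Algorithms.decomposes: split a JCA signature
    // algorithm name like "SHA256withRSA" into its digest and key components.
    static String[] decompose(String sigAlgName) {
        return sigAlgName.split("(?i)with");
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(decompose("SHA256withRSA"))); // [SHA256, RSA]
    }
}
```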

There's an AlgorithmChecker that checks for signature and key algorithms:

class AlgorithmChecker(val signatureConstraints: Set[AlgorithmConstraint], val keyConstraints: Set[AlgorithmConstraint]) extends PKIXCertPathChecker {
  ...
  def check(cert: Certificate, unresolvedCritExts: java.util.Collection[String]) {
    cert match {
      case x509Cert: X509Certificate =>

        val commonName = getCommonName(x509Cert)
        val subAltNames = x509Cert.getSubjectAlternativeNames
        logger.debug(s"check: checking certificate commonName = $commonName, subjAltName = $subAltNames")

        checkSignatureAlgorithms(x509Cert)
        checkKeyAlgorithms(x509Cert)
      case _ =>
        throw new UnsupportedOperationException("check only works with x509 certificates!")
    }
  }
  ...
}

and finally:

class AlgorithmChecker(val signatureConstraints: Set[AlgorithmConstraint], val keyConstraints: Set[AlgorithmConstraint]) extends PKIXCertPathChecker {
  ...
  def checkSignatureAlgorithms(x509Cert: X509Certificate): Unit = {
    val sigAlgName = x509Cert.getSigAlgName
    val sigAlgorithms = Algorithms.decomposes(sigAlgName)

    logger.debug(s"checkSignatureAlgorithms: sigAlgName = $sigAlgName, sigAlgorithms = $sigAlgorithms")

    for (a <- sigAlgorithms) {
      findSignatureConstraint(a).map {
        constraint =>
          if (constraint.matches(a)) {
            logger.debug(s"checkSignatureAlgorithms: x509Cert = $x509Cert failed on constraint $constraint")
            val msg = s"Certificate failed: $a matched constraint $constraint"
            throw new CertPathValidatorException(msg)
          }
      }
    }
  }

  def checkKeyAlgorithms(x509Cert: X509Certificate): Unit = {
    val key = x509Cert.getPublicKey
    val keyAlgorithmName = key.getAlgorithm
    val keySize = Algorithms.keySize(key).getOrElse(throw new IllegalStateException(s"No keySize found for $key"))

    val keyAlgorithms = Algorithms.decomposes(keyAlgorithmName)
    logger.debug(s"checkKeyAlgorithms: keyAlgorithmName = $keyAlgorithmName, keySize = $keySize, keyAlgorithms = $keyAlgorithms")

    for (a <- keyAlgorithms) {
      findKeyConstraint(a).map {
        constraint =>
          if (constraint.matches(a, keySize)) {
            val certName = x509Cert.getSubjectX500Principal.getName
            logger.debug(s"""checkKeyAlgorithms: cert = "$certName" failed on constraint $constraint, algorithm = $a, keySize = $keySize""")

            val msg = s"""Certificate failed: cert = "$certName" failed on constraint $constraint, algorithm = $a, keySize = $keySize"""
            throw new CertPathValidatorException(msg)
          }
      }
    }
  }
}

Now that we have an algorithm checker, we need to put it into the validation chain.

There are two ways of validating a chain in JSSE. The first is using CertPathValidator, which validates a certificate chain according to RFC 3280. The second is CertPathBuilder, which "builds" a certificate chain according to RFC 4158. I've been told by informed experts that CertPathBuilder is actually closer to the behavior of modern browsers, but in this case, we're just adding onto the chain of PKIXCertPathChecker. There are several layers of configuration to go through, but eventually we pass this through to the TrustManager.
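The wiring itself is plain JDK API. Here is a minimal, hypothetical sketch of adding a custom PKIXCertPathChecker (one that rejects MD5 signatures) to PKIXParameters; it assumes the default trust store lives at $JAVA_HOME/lib/security/cacerts:

```java
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.CertPathValidatorException;
import java.security.cert.Certificate;
import java.security.cert.PKIXCertPathChecker;
import java.security.cert.PKIXParameters;
import java.security.cert.X509Certificate;
import java.util.Collection;
import java.util.Set;

public class CheckerWiring {
    // A minimal custom checker that rejects any certificate signed with MD5.
    static class Md5Checker extends PKIXCertPathChecker {
        @Override public void init(boolean forward) {}
        @Override public boolean isForwardCheckingSupported() { return false; }
        @Override public Set<String> getSupportedExtensions() { return null; }
        @Override public void check(Certificate cert, Collection<String> unresolvedCritExts)
                throws CertPathValidatorException {
            X509Certificate x509 = (X509Certificate) cert;
            if (x509.getSigAlgName().toUpperCase().contains("MD5")) {
                throw new CertPathValidatorException(
                    "MD5 signature on " + x509.getSubjectX500Principal());
            }
        }
    }

    static PKIXParameters buildParams() throws Exception {
        // Assumes the default trust store is at $JAVA_HOME/lib/security/cacerts.
        KeyStore cacerts = KeyStore.getInstance(KeyStore.getDefaultType());
        try (FileInputStream in = new FileInputStream(
                System.getProperty("java.home") + "/lib/security/cacerts")) {
            cacerts.load(in, null);
        }
        PKIXParameters params = new PKIXParameters(cacerts);
        params.addCertPathChecker(new Md5Checker()); // our custom checker joins the chain
        params.setRevocationEnabled(false);
        return params;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("checkers = " + buildParams().getCertPathCheckers().size());
    }
}
```

These parameters would then be handed to a CertPathValidator (or, via CertPathTrustManagerParameters, to a TrustManagerFactory).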

However, this doesn't check the root CA certificate, because that doesn't get passed in through the PKIXCertPathChecker. So how does SSLAlgorithmConstraints get at the root certificate?

Well, it's handled through the CertPathValidator instantiation. X509TrustManagerImpl calls Validator.getInstance(validatorType, variant, trustedCerts) – this returns new PKIXValidator(variant, trustedCerts), and from there, PKIXValidator puts the trusted certs into PKIXBuilderParameters, and then calls doValidate.

public final class PKIXValidator extends Validator {

    private X509Certificate[] doValidate(X509Certificate[] chain,
            PKIXBuilderParameters params) throws CertificateException {
        try {
            setDate(params);

            // do the validation
            CertPathValidator validator = CertPathValidator.getInstance("PKIX");
            CertPath path = factory.generateCertPath(Arrays.asList(chain));
            certPathLength = chain.length;
            PKIXCertPathValidatorResult result =
                (PKIXCertPathValidatorResult)validator.validate(path, params);

            return toArray(path, result.getTrustAnchor());
        } catch (GeneralSecurityException e) {
            throw new ValidatorException
                ("PKIX path validation failed: " + e.toString(), e);
        }
    }

}

So now we've moved on to the PKIXCertPathValidator, which pulls out a trust anchor for the AlgorithmChecker.

public class PKIXCertPathValidator extends CertPathValidatorSpi {
  private PolicyNode doValidate(
              TrustAnchor anchor, CertPath cpOriginal,
              ArrayList<X509Certificate> certList, PKIXParameters pkixParam,
              PolicyNodeImpl rootNode) throws CertPathValidatorException
  {  
     ...
     AlgorithmChecker algorithmChecker = new AlgorithmChecker(anchor);
     ...
  }  
}

This means that the AlgorithmChecker can check for the weak key size in the trust anchor, but this only works if you control the validator chain. The PKIXBuilderParameters object is not passed to PKIXCertPathChecker, so we can't simply extend PKIXCertPathChecker and pull out the trust anchor we'd like – we have to do this from the TrustManager directly. Easy enough:

class CompositeX509TrustManager(trustManagers: Seq[X509TrustManager], algorithmChecker: AlgorithmChecker) extends X509TrustManager {

  def checkServerTrusted(chain: Array[X509Certificate], authType: String): Unit = {
    logger.debug(s"checkServerTrusted: chain = ${debugChain(chain)}, authType = $authType")

    // Trust anchor is at the end of the chain... there is no way to pass a trust anchor
    // through to a checker in PKIXCertPathValidator.doValidate(), so the trust manager is the
    // last place we have access to it.
    val anchor: TrustAnchor = new TrustAnchor(chain(chain.length - 1), null)
    logger.debug(s"checkServerTrusted: checking key size only on root anchor $anchor")
    algorithmChecker.checkKeyAlgorithms(anchor.getTrustedCert)

    var trusted = false
    val exceptionList = withTrustManagers {
      trustManager =>
        // always run through the trust manager before making any decisions
        trustManager.checkServerTrusted(chain, authType)
        logger.debug(s"checkServerTrusted: trustManager $trustManager using authType $authType found a match for ${debugChain(chain).toSeq}")
        trusted = true
    }

    if (!trusted) {
      val msg = s"No trust manager was able to validate this certificate chain: # of exceptions = ${exceptionList.size}"
      throw new CompositeCertificateException(msg, exceptionList.toArray)
    }
  }
}

To do this through configuration is a bit more work. We have to create a PKIXBuilderParameters object and then attach the AlgorithmChecker to it, then stick that inside ANOTHER parameters object called CertPathTrustManagerParameters and then pass that into the factory.init method. We end up with a single CompositeX509TrustManager class, and a bunch of trust managers all configured with the same AlgorithmChecker:

class ConfigSSLContextBuilder {

  def buildCompositeTrustManager(trustManagerInfo: TrustManagerConfig,
    revocationEnabled: Boolean,
    revocationLists: Option[Seq[CRL]], algorithmChecker: AlgorithmChecker) = {

    val trustManagers = trustManagerInfo.trustStoreConfigs.map {
      tsc =>
        buildTrustManager(tsc, revocationEnabled, revocationLists, algorithmChecker)
    }
    new CompositeX509TrustManager(trustManagers, algorithmChecker)
  }

  def buildTrustManager(tsc: TrustStoreConfig,
    revocationEnabled: Boolean,
    revocationLists: Option[Seq[CRL]], algorithmChecker: AlgorithmChecker): X509TrustManager = {

    val factory = trustManagerFactory
    val trustStore = trustStoreBuilder(tsc).build()
    validateStore(trustStore, algorithmChecker)

    val trustManagerParameters = buildTrustManagerParameters(
      trustStore,
      revocationEnabled,
      revocationLists,
      algorithmChecker)

    factory.init(trustManagerParameters)
    val trustManagers = factory.getTrustManagers
    if (trustManagers == null) {
      val msg = s"Cannot create trust manager with configuration $tsc"
      throw new IllegalStateException(msg)
    }

    // The JSSE implementation only sends back ONE trust manager, X509TrustManager
    trustManagers.head.asInstanceOf[X509TrustManager]
  }

  def buildTrustManagerParameters(trustStore: KeyStore,
    revocationEnabled: Boolean,
    revocationLists: Option[Seq[CRL]],
    algorithmChecker: AlgorithmChecker): CertPathTrustManagerParameters = {
    import scala.collection.JavaConverters._

    val certSelect: X509CertSelector = new X509CertSelector
    val pkixParameters = new PKIXBuilderParameters(trustStore, certSelect)
    // ...

    // Add the algorithm checker in here to check the certification path sequence (not including trust anchor)...
    val checkers: Seq[PKIXCertPathChecker] = Seq(algorithmChecker)

    // Use the custom cert path checkers we defined...
    pkixParameters.setCertPathCheckers(checkers.asJava)
    new CertPathTrustManagerParameters(pkixParameters)
  }
}

And now we can check for weak key sizes and bad certificates the same way JSSE 1.7 does.

This still isn't the best user experience, because it will result in a broken TLS connection at run time. We'd like to give the user as much information as early as we can, and not waste time on certificates that we know are going to fail. We can simply filter out certificates that don't pass muster.

To do this, we iterate through every trust anchor we have in the trust store, and verify that it matches our constraints.

class ConfigSSLContextBuilder {
  /**
   * Tests each trusted certificate in the store, and warns if the certificate is not valid.  Does not throw
   * exceptions.
   */
  def validateStore(store: KeyStore, algorithmChecker: AlgorithmChecker) {
    import scala.collection.JavaConverters._
    logger.debug(s"validateStore: type = ${store.getType}, size = ${store.size}")

    store.aliases().asScala.foreach {
      alias =>
        Option(store.getCertificate(alias)).map {
          c =>
            try {
              algorithmChecker.checkKeyAlgorithms(c)
            } catch {
              case e: CertPathValidatorException =>
                logger.warn(s"validateStore: skipping certificate with weak key size in $alias: ${e.getMessage}")
                store.deleteEntry(alias)
              case e: Exception =>
                logger.warn(s"validateStore: skipping $alias due to unexpected exception: ${e.getMessage}")
                store.deleteEntry(alias)
            }
        }
    }
  }
}

But we're still not done. The default trust store is used if SSLContext is initialized with null, and we don't have access to it unless we do horrible things with reflection.

However, given that the default SSLContextImpl calls out to the TrustManagerFactory, and that any configuration via system properties also applies to the factory, we can use the factory method to recreate the trust manager and validate the trusted certificates that way.

So given:

    val useDefault = sslConfig.default.getOrElse(false)
    val sslContext = if (useDefault) {
      logger.info("buildSSLContext: ws.ssl.default is true, using default SSLContext")
      validateDefaultTrustManager(sslConfig)
      SSLContext.getDefault
    } else {
      // break out the static methods as much as we can...
      val keyManagerFactory = buildKeyManagerFactory(sslConfig)
      val trustManagerFactory = buildTrustManagerFactory(sslConfig)
      new ConfigSSLContextBuilder(sslConfig, keyManagerFactory, trustManagerFactory).build()
    }

We can do this:

  def validateDefaultTrustManager(sslConfig: SSLConfig) {
    // This is really a last ditch attempt to satisfy https://wiki.mozilla.org/CA:MD5and1024 on root certificates.
    // http://grepcode.com/file/repository.grepcode.com/java/root/jdk/openjdk/7-b147/sun/security/ssl/SSLContextImpl.java#79

    val tmf = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm)
    tmf.init(null.asInstanceOf[KeyStore])
    val trustManager: X509TrustManager = tmf.getTrustManagers()(0).asInstanceOf[X509TrustManager]

    val disabledKeyAlgorithms = sslConfig.disabledKeyAlgorithms.getOrElse(Algorithms.disabledKeyAlgorithms)
    val constraints = AlgorithmConstraintsParser.parseAll(AlgorithmConstraintsParser.line, disabledKeyAlgorithms).get.toSet
    val algorithmChecker = new AlgorithmChecker(keyConstraints = constraints, signatureConstraints = Set())
    for (cert <- trustManager.getAcceptedIssuers) {
      algorithmChecker.checkKeyAlgorithms(cert)
    }
  }

And now… we're done. Now we can check for bad X.509 algorithms out of the box, and have it be local to the SSLContext.

Testing

The best way to create X.509 certificates with Java is using keytool. Unfortunately, keytool doesn't support subjectAltName in 1.6, but in 1.7 and 1.8 you can specify the subjectAltName (which is required for hostname verification) using the -ext parameter.

For example, to create your own self signed certificate (both private and public keys) for use in testing, you would specify:

keytool -genkeypair \
-keystore keystore.jks \
-dname "CN=example.com, OU=Example Org, O=Example Company, L=San Francisco, ST=California, C=US" \
-keypass changeit \
-storepass changeit \
-keyalg RSA \
-keysize 2048 \
-alias example.com \
-ext SAN=DNS:example.com \
-validity 365

And then add example.com to /etc/hosts.

You can verify your certificate with KeyStore Explorer, a GUI tool for certificates, or java-keyutil, or you can list your certificate directly:

keytool -list -v -alias example.com -storepass changeit -keystore keystore.jks

You should see:

Alias name: example.com
Creation date: Mar 26, 2014
Entry type: PrivateKeyEntry
Certificate chain length: 1
Certificate[1]:
Owner: CN=example.com, OU=Example Org, O=Example Company, L=San Francisco, ST=California, C=US
Issuer: CN=example.com, OU=Example Org, O=Example Company, L=San Francisco, ST=California, C=US
Serial number: 4180f5e0
Valid from: Wed Mar 26 10:22:59 PDT 2014 until: Thu Mar 26 10:22:59 PDT 2015
Certificate fingerprints:
	 MD5:  F3:32:40:C9:00:59:D3:32:E1:75:85:7A:A9:68:6D:F5
	 SHA1: 37:9D:90:44:AB:41:AD:8D:F5:E4:6C:03:5F:22:61:53:EF:23:67:1E
	 SHA256: 88:FF:83:43:E1:2D:F1:19:7B:3E:1D:4D:88:40:C3:8C:8A:96:2D:75:16:4F:C8:E9:0B:99:F5:0E:53:4A:C1:17
	 Signature algorithm name: SHA256withRSA
	 Version: 3

Extensions:

#1: ObjectId: 2.5.29.17 Criticality=false
SubjectAlternativeName [
  DNSName: example.com
]

#2: ObjectId: 2.5.29.14 Criticality=false
SubjectKeyIdentifier [
KeyIdentifier [
0000: 62 30 8E 8C F2 7C 7A BC   FD EB AC 75 F6 BD FD F1  b0....z....u....
0010: 3E 73 D5 A9                                        >s..
]
]

If you see a signature algorithm name of SHA256withRSA and DNSName: example.com, then it worked, and calls to "https://example.com" will work fine. You can then pass in your local keystore using the options defined in the customization section:

java -Djavax.net.ssl.trustStore=keystore.jks -Djavax.net.ssl.keyStore=keystore.jks -Djavax.net.ssl.keyStorePassword=changeit -Djavax.net.ssl.trustStorePassword=changeit

Or you can wire the certificates into an SSLContext directly using a TrustManagerFactory and a KeyManagerFactory, and then set up a server and a client from the SSLContext as shown here:

    private SSLContext sslc;
    private SSLEngine ssle1;    // client
    private SSLEngine ssle2;    // server

    private void createSSLEngines() throws Exception {
        ssle1 = sslc.createSSLEngine("client", 1);
        ssle1.setUseClientMode(true);

        ssle2 = sslc.createSSLEngine("server", 2);
        ssle2.setUseClientMode(false);
    }

X.509 certificates are one of the moving pieces of TLS that have many, many ways of going wrong. Be prepared to find out-of-order certificates, missing intermediate certificates, and other problematic practices.

Certificate path debugging can be turned on using the -Djava.security.debug=certpath and -Djavax.net.debug="ssl trustmanager" settings. How to analyze Java SSL errors is a good example of tracking down bugs.

Next

Certificate Revocation!
