ScreenConnect: “unauthenticated attributes” are not authenticated

(Lessons from the ScreenConnect certificate-revocation episode)

An earlier blog post recounted the discovery of threat actors leveraging the ScreenConnect remote assistance application in the wild, and events leading up to DigiCert revoking the certificate previously issued to the vendor ConnectWise for signing those binaries. This follow-up is a deeper, more technical dive into a design flaw in the ScreenConnect executable that made it particularly appealing for malicious campaigns.

Trust decisions in an open ecosystem

Before discussing what went wrong with ScreenConnect, let’s cover the “sunny-day path”: how code-signing is supposed to work. To set context, let’s rewind the clock ~20 years, back to when software distribution was far more decentralized. Today most software applications are purchased through a tightly controlled app store such as the one Apple operates for Macs and iPhones. In fact mobile devices are locked down to such an extent that it is not possible to “side-load” applications from any other source without jumping through hoops. But this was not always the case, and certainly not for the PC ecosystem. Sometime in the late 1990s, with the mass adoption of the Internet, downloading software increasingly replaced the purchase of physical media. While anachronistic traces of “shrink-wrap licenses” survive in the terminology, few consumers were actually removing shrink-wrapping from a cardboard box containing installation CDs. More likely their software was downloaded using a web browser directly from the vendor website.

That shift had a darker side: it was a boon for malware distribution. Creating packaged software with physical media takes time and expense. Convincing a retailer to devote scarce shelf-space to that product is an even bigger barrier. But anyone can create a website and claim to offer a valuable piece of software, available for nothing more than the patience required for slow downloads over the meager “broadband” speeds of the era. Operating system vendors even encouraged this model: Sun pushed Java applets in the browser as a way to add interactivity to static HTML pages. Applets were portable: written for the Java Virtual Machine, they could run just as well on Windows, Mac and the 37 flavors of UNIX in existence at the time. MSFT predictably responded with a Windows-centric take on this perceived competitive threat against the crown jewels: ActiveX controls. These were effectively native code shared libraries, with full access to the Windows API. No sandboxing, no restrictions once execution started. A perfect vehicle for malware distribution.

Code-signing as panacea?

Enter Authenticode. Instead of trying to constrain what applications can do once they start running, MSFT opted for a black & white model for trust decisions made at installation time, based on the pedigree of the application. Authenticode is a code-signing standard that can be applied to any Windows binary: ActiveX controls, executables, DLLs, installers, Office macros and, through an extensibility layer, even third-party file formats, although there were few takers outside of the Redmond orbit. (Java continued to use its own cross-platform JAR signing format on Windows, instead of the “native” way.) It is based on public-key cryptography and PKI, much like TLS certificates. Every software publisher generates a key-pair and obtains a digital certificate from one of a handful of trusted “certificate authorities.” The certificate associates the public-key with the identity of the vendor, for example asserting that a specific RSA public-key belongs to Google. Google can use its private-key to digitally sign any software it publishes. Consumers downloading that software can then verify the signature to confirm that the software was indeed written by Google.

A proper critique of everything wrong with this model— starting with its naive equation of “identified vendor” with “trusted/good” software you can feel confident installing— would take a separate essay. For the purposes of this blog post, let’s suspend disbelief and assume that reasonable trust decisions can be made based on a valid Authenticode signature. What else can go wrong?

Custom installers

One of the surprising properties of the ScreenConnect installer is that the application is completely turnkey: after installation, the target PC is immediately placed under remote control of a particular customer. No additional configuration files to download, no questions asked of end users. (Of course this property makes ScreenConnect equally appealing to malicious actors as it is to IT administrators.) That means the installer has all the necessary configuration included somehow. For example, it must know which remote server to connect to for receiving remote commands. That URL will be different for every customer.

By running strings on the application, we can quickly locate this XML configuration.
For the malicious installer masquerading as a bogus River “desktop app”:

This means ScreenConnect is somehow creating a different installer on-demand for every customer. The UI itself appears to support that thesis. There is a form with a handful of fields you can complete before downloading the installer. Experiments confirm that a different binary is served when those parameters are changed.
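The strings-based check above can be scripted. Below is a minimal sketch (standard library only, demonstrated on synthetic data rather than a real installer) of scanning a binary blob for embedded XML and reporting where it sits:

```python
# Scan a blob for embedded XML configuration, reporting byte offsets --
# roughly the scripted equivalent of `strings -t d file | grep '<?xml'`.

def find_xml_offsets(data: bytes) -> list:
    """Return the byte offset of every '<?xml' marker in the blob."""
    offsets, start = [], 0
    while (pos := data.find(b"<?xml", start)) != -1:
        offsets.append(pos)
        start = pos + 1
    return offsets

# Demo on synthetic data standing in for an installer image: the config
# sits at the very end of the file, just like in the ScreenConnect case.
blob = b"\x00" * 1024 + b"<?xml version='1.0'?><config/>"
assert find_xml_offsets(blob) == [1024]
```

Running this against a real installer (read the file in binary mode first) shows whether the configuration lands in the trailing region where the signature lives.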

That would also imply that ConnectWise must be signing binaries on the fly. A core assumption in code-signing is that a digitally signed application cannot be altered without invalidating that signature. (If that were not true, signatures would become meaningless: an attacker could take an authentic, benign binary, modify it to include malicious behavior, and have that binary continue to appear as the legitimate original. There have been implementation flaws in Authenticode that allowed such changes, but these were considered vulnerabilities and addressed by Microsoft.)

But using osslsigncode to inspect the signature in fact shows:

    1. All binaries have the same timestamp. (Recall this is a third-party timestamp, effectively a countersignature, provided by a trusted third party, very often a certificate authority.)

    2. All binaries have the same hash for the signed portion.

Misuse of unauthenticated attributes

That second property requires some explanation. In an ideal world the signature would cover every bit of the file— except itself, to avoid creating a self-referential loop. There are indeed some simple code-signing standards that work this way: raw signature bytes are tacked on at the end of the file, offloading all complexity around format and key-management (which certificate should be used to verify this signature?) to the verifier.
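As a toy illustration of such a trailer scheme (an HMAC stands in for a real public-key signature; the key and sizes here are made up for the demo):

```python
# Toy "signature tacked on at the end" scheme: the raw signature bytes are
# appended after the content, and the verifier hashes everything except
# the trailing signature. An HMAC substitutes for a real signature.
import hashlib
import hmac

KEY = b"demo-signing-key"   # hypothetical key, used only for this sketch
SIG_LEN = 32                # SHA-256 output size

def sign(content: bytes) -> bytes:
    """Append the raw signature bytes after the content."""
    return content + hmac.new(KEY, content, hashlib.sha256).digest()

def verify(blob: bytes) -> bool:
    """Hash everything except the trailing signature, then compare."""
    content, sig = blob[:-SIG_LEN], blob[-SIG_LEN:]
    return hmac.compare_digest(sig, hmac.new(KEY, content, hashlib.sha256).digest())

signed = sign(b"program bytes")
assert verify(signed)

# Flipping a single content bit breaks verification:
tampered = bytes([signed[0] ^ 1]) + signed[1:]
assert not verify(tampered)
```

Note that in this scheme every byte of the content is covered; there is no slot where mutable data could hide.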

While Authenticode signatures also appear at the end of binaries, their format is on the opposite end of the spectrum. It is based on a complex standard called Cryptographic Message Syntax (CMS), which also underlies other PKI formats including S/MIME for encrypted/signed email. CMS defines complex nested structures encoded using a binary format called ASN1. A typical Authenticode signature features:

  • Actual signature of the binary from the software publisher
  • Certificate of the software publisher generating that signature
  • Any intermediate certificates chaining up to the issuer required to validate the signature
  • Time-stamp from trusted third-party service
  • Certificate of the time-stamping service & additional intermediate CAs

None of these fields are covered by the signature. (Although the time-stamp itself covers the publisher signature, as it is considered a “counter-signature.”) More generally, CMS defines a concept of “unauthenticated attributes”: parts of the file that are not covered by the signature and, by implication, can be modified without invalidating the signature.
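A toy model makes the consequence concrete. This is not the real CMS verification logic, just an illustrative sketch: the digest covers the signed content and authenticated attributes, so bytes parked in an unauthenticated attribute can change freely without affecting the result:

```python
# Toy model of digest computation during verification: the verifier hashes
# the signed content plus authenticated attributes, and deliberately
# ignores unauthenticated attributes. Illustrative only, not real CMS.
import hashlib

def verify(signed_content: bytes, auth_attrs: bytes,
           unauth_attrs: bytes, expected_digest: bytes) -> bool:
    h = hashlib.sha256()
    h.update(signed_content)
    h.update(auth_attrs)      # covered by the signature
    # note: unauth_attrs is never hashed
    return h.digest() == expected_digest

content, attrs = b"binary code", b"signing-time"
digest = hashlib.sha256(content + attrs).digest()

# Two different "configurations" in the unauthenticated slot both verify:
assert verify(content, attrs, b"<url>customer.example</url>", digest)
assert verify(content, attrs, b"<url>attacker.example</url>", digest)
```

Both assertions pass with the same digest, which is exactly the property ScreenConnect relied on.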

It turns out the ScreenConnect authors made a deliberate decision to abuse the Authenticode format: they place the configuration in one of these unauthenticated attributes. The first clue comes from dumping strings from the binary along with the offset where they occur. In a 5673KB file, the XML configuration appears within the last 2 kilobytes— the region where we expect to find the signature itself.

The extent of this anti-pattern becomes clear when we use “osslsigncode extract-signature” to isolate the signature section:

$ osslsigncode extract-signature RiverApp.ClientSetup.exe RiverApp.ClientSetup.sig
Current PE checksum   : 005511AE
Calculated PE checksum: 0056AFC0
Warning: invalid PE checksum
Succeeded

$ ls -l RiverApp.ClientSetup.sig
-rw-rw-r-- 1 randomoracle randomoracle 122079 Jun 21 12:35 RiverApp.ClientSetup.sig

122KB? That far exceeds the amount of space any reasonable Authenticode signature could take up, even including all certificate chains. Using the openssl pkcs7 subcommand to parse this structure reveals the culprit for the bloat at offset 10514:

There is a massive ~110K section using the esoteric OID “1.3.6.1.4.1.311.4.1.1”. (The prefix 1.3.6.1.4.1.311 is reserved for MSFT; any OID starting with that prefix is specific to Microsoft.)

Looking at the ASN1 value, we find a kitchen sink of random content:

  • More URLs
  • Additional configuration as XML files
  • Error messages encoded in Unicode (“AuthenticatedOperationSuccessText”)
  • English UI strings as ASCII strings (“Select Monitors”)
  • Multiple PNG image files

It’s important to note that ScreenConnect went out of its way to do this. This is not an accidental feature one can stumble into. Simply tacking on 110K at the end of the file will not work. Recall that the signature is encapsulated in a complex, hierarchical data structure encoded in ASN1. Every element contains a length field. Adding anything to this structure requires updating the length field for every enclosing element. That’s not simple concatenation: it requires precisely controlled edits to ASN1. (For an example, see this proof-of-concept that shows how to “graft” the unauthenticated attribute section from one ScreenConnect binary to another using the Python asn1crypto module.)
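The length-field bookkeeping described above can be demonstrated in a few lines. This is a toy DER structure, not the real CMS layout of an Authenticode signature, but it shows why naive concatenation corrupts the encoding while a proper splice must recompute every enclosing length:

```python
# Why splicing data into an ASN.1 structure is not simple concatenation:
# every enclosing element carries a length field that must be rewritten.

def der_len(n: int) -> bytes:
    """DER definite-length encoding (short or long form)."""
    if n < 0x80:
        return bytes([n])
    body = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return bytes([0x80 | len(body)]) + body

def tlv(tag: int, body: bytes) -> bytes:
    """Encode one Tag-Length-Value element."""
    return bytes([tag]) + der_len(len(body)) + body

# Toy nesting: SEQUENCE { SEQUENCE { OCTET STRING "config" } }
outer = tlv(0x30, tlv(0x30, tlv(0x04, b"config")))

# Naive append: the outer length field (10) no longer matches the content,
# so any conforming parser rejects the result.
corrupted = outer + b"X" * 300
assert corrupted[1] == 10 and len(corrupted) - 2 != 10

# Correct splice: rebuild from the inside out so every length is recomputed.
patched = tlv(0x30, tlv(0x30, tlv(0x04, b"config" + b"X" * 300)))
assert patched[1] == 0x82   # outer length switched to two-byte long form
```

Every level of nesting grew, and the length encodings themselves changed form, which is the kind of precisely controlled edit the asn1crypto-based proof-of-concept automates.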

The problem with mutable installers

The risks posed by this design become apparent when we look at what ScreenConnect does after installation: it automatically grants control of the current machine to a remote third-party. To make matters worse, this behavior is stealthy by design. As discussed in the previous blog post, there are no warnings, no prompts to confirm intent and no visual indicators whatsoever that a third-party has been given privileged access.

That would have been dangerous on its own— ripe for abuse if a ScreenConnect customer uses that binary for managing machines that are not part of their enterprise. At that point it crosses the line from “remote support application” into “remote administration Trojan” (RAT) territory. But the ability to tamper with configuration in a signed binary gives malicious actors even more leeway. They do not even need to be a ScreenConnect customer. All they need to do is get their hands on one signed binary in the wild. They can then edit the configuration residing in the unauthenticated ASN1 attribute, changing the URL for the command & control server to one controlled by the attacker. The Authenticode signature continues to validate, and the tampered binary still gets the streamlined experience from Windows: one-click install without an elevation prompt. But instead of connecting to a server managed by the original ScreenConnect customer, it will now connect to the attacker’s command & control server to receive remote commands.

Resolution

This by-design behavior in ScreenConnect was deemed such a high risk that the certificate authority (DigiCert) that issued ConnectWise their Authenticode certificate took the extraordinary step of revoking the certificate and invalidating all previously signed binaries. ConnectWise was forced to scramble and coordinate a response with all customers to upgrade to a new version of the binary. The new version no longer embeds critical configuration data in unauthenticated signature attributes.

While the specific risk with ScreenConnect has been addressed, it is worth pointing out that nothing prevents similar installers from being created by other software publishers. No changes have been made to Authenticode verification logic in Windows to reject extra baggage appearing in signatures. It is not even clear if such a policy can be enforced. There is enough flexibility in the format to include seemingly innocuous data such as extra self-signed certificates in the chain. For that matter, even authenticated fields can be used to carry extra information, such as the optional nonce field in the time-stamp. For the foreseeable future it is up to each vendor to refrain from using such tricks and creating installers that can be modified by malicious actors.

CP

Acknowledgments: Ryan Hurst for help with the investigation and escalating to DigiCert

The story behind ScreenConnect certificate revocation

An unusual phishing site

In late May, the River security team received a notification about a new fraudulent website impersonating our service. Phishing is a routine occurrence that every industry player contends with. There are common playbooks invoked to take down offending sites when one is discovered.

What made this case stand out was the tactic employed by the attacker. Most phishing pages go after credentials. They present a fraudulent authentication page that mimics the real one, asking for password or OTP codes for 2FA. Yet the page we were alerted about did not have any way to log in. Instead, it advertised a fake “River desktop app.” River publishes popular mobile apps for iOS and Android, but there has never been a desktop application for Windows, macOS, or Linux.

As this screenshot demonstrates, the home page was subtly altered to replace the yellow “Sign up” button on the upper-right corner with one linking to the bogus desktop application. We observed the site always serves the same Windows app, regardless of the web browser or operating system used to view the page. Google Chrome on macOS and Firefox on Linux both received the same Windows binary, despite the fact that it could not have run successfully on those platforms.

This looked like a bizarre case of a threat actor jumping through hoops to write an entire Windows application to confuse River clients. Native Windows applications are a rare breed these days— most services are delivered through web or mobile apps. The story only got stranger once we discovered the application carried a valid code signature.

Authentic malware

Quick recap on code signing: Microsoft has a standard called “Authenticode” for digitally signing Windows applications. These signatures establish the provenance and integrity of the software, proving authorship and guaranteeing that the application has not been tampered with since it was published. This is crucial for making trust decisions in an open ecosystem where applications may be sourced from anywhere, not just a curated app store.

Authenticode signatures can be examined on non-Windows platforms using the open-source osslsigncode utility. This binary was signed by ConnectWise, using a valid certificate issued in 2022 from DigiCert:

Windows malware is pervasive, but malware bearing a valid digital signature is less common, and short-lived. Policies around code-signing certificates are clear on one point: if it is shown that a certificate is used to sign harmful code, the certificate authority is obligated to revoke it. (Note that code-signing certificates are governed by the same CAB Forum that sets issuance standards for TLS certificates, but under a different set of rules than the more common TLS use-case.)

ConnectWise is a well-known company that has been producing software for IT support for over a decade. As it is unlikely for such a reputable business to operate malware campaigns on the side, our first theory was a case of key-compromise: a threat actor obtained the private keys belonging to ConnectWise and started signing their own malicious binaries with them. This is the most common explanation for malware that is seemingly published by reputable companies: someone else took their keys and certificate. Perhaps the most famous case was the Stuxnet malware targeting Iran’s nuclear enrichment program in 2010, using Windows binaries signed with valid certificates of two Taiwanese companies with no relationship to either the target or the (presumed) attackers.

Looking closer at the “malware” served from the fraudulent website we were investigating, we discovered something even more bizarre: the attackers did not go to the trouble of writing a new application from scratch or even vibe-coding one with AI. This was the legitimate ScreenConnect application published by ConnectWise, served up verbatim, simply renamed as a bogus River desktop application.

That was not an isolated example. On the same server, we discovered samples of the exact same binary relabeled to impersonate additional applications, including a cryptocurrency wallet. We are far from being the first or only group to observe this in the wild. Malwarebytes noted social-security scams delivering ScreenConnect installer in April this year, and Lumu published an advisory around the same time.

Fine line between remote assistance and RAT

ScreenConnect is a remote-assistance application for Windows, Mac, and even Linux systems. Once installed, it allows an IT department to remotely control a machine, for example by deploying additional software, running commands in the background, or even joining an interactive screen-sharing session with the user to help troubleshoot problems. 

Below is an example of what an IT administrator might see on the other side when using the server-side of ScreenConnect, either self-hosted or via a cloud service provided by ConnectWise. 

Example remote command invocation via ScreenConnect dashboard. Note commands are executed as the privileged NT AUTHORITY\SYSTEM user on the target system.

At least this is the intended use case. From a security perspective, ScreenConnect is a classic example of a “dual-use application.” In the right hands, it can deliver a productivity boost to overworked IT departments, helping them provide better support to their colleagues. In the wrong hands, it becomes a weapon for malicious actors to remotely compromise machines belonging to unsuspecting users. To be clear: ScreenConnect is not alone in this capacity. There are multiple documented instances of remote-assistance apps repurposed by threat actors at scale to remotely commandeer PCs of users they had no relationship with. But there are specific design decisions in the ScreenConnect installer as well as the application itself that greatly amplify the potential for abuse:

  • The installation proceeds with no notice or consent. Because the binary carries a valid Authenticode signature, elevation to administrator privileges is automatic. Once elevated, there are no additional warnings or indications, nothing to help the consumer realize they are about to install a dangerous piece of software and cede control of their PC to an unknown third-party.
  • Once installed, remote control takes effect immediately. No reboot required, no additional dialog asking users to activate the functionality.
  • There is no indication that the PC is under remote management or that remote commands are being issued. For example, there is no system tray icon, notifications, or other visual indicators. (Compare this to how screen sharing— far less intrusive than full remote control— works with Zoom or Google Meet: users are given a clear indication that another participant is viewing their screen, along with a link to stop sharing at any time.)
  • There is no desktop icon or Windows menu entry created for ScreenConnect. For a customer who was expecting to get the River desktop app, it looks like the installation silently failed because their desktop looks the same as before. To understand what happened, users would have to visit the Windows control panel, review installed programs and observe that an unexpected entry called “ScreenConnect” has appeared there.
After installation, no indication that ScreenConnect is present on the system.
  • Compounding these client-side design decisions, ScreenConnect was offering a 14-day free trial with nothing more than a valid email address required to sign up. [The trial page now states that it is undergoing maintenance— last visited June 15th.] A threat actor could take advantage of this opportunity to download customized installers such that upon completion, the machine where the installer ran would be under the control of that actor. (It is unclear if the threat actor impersonating River used a free trial with the cloud instance, or if they compromised a server belonging to an existing ScreenConnect customer. Around the same time we found the malware masquerading as a River desktop application, CISA issued a warning about a ScreenConnect server vulnerability being exploited in the wild.)

Disclosure timeline

  • May 30: Emails sent to security@ aliases for ScreenConnect and ConnectWise. No acknowledgment received in response.
    • We later determined that the company expects to receive vulnerability disclosures at a different email alias, and our initial reports did not reach the security team.
  • Jun 1: Ryan Hurst helped escalate the issue to DigiCert and outlined why this usage of a code-signing certificate contravenes CAB Forum rules. 
  • Jun 2: DigiCert acknowledges receipt of our report.
  • Jun 3: DigiCert confirms the certificate has been revoked.
    • Initial revocation time was set to June 3rd. Because Authenticode signatures also carry a trusted third-party timestamp, it is possible to revoke binaries forward from a specific point in time. This is useful when a specific key-compromise date can be identified: all binaries time-stamped before that point remain valid, all later binaries are invalidated. The malicious sample found in the wild masquerading as a River desktop app was timestamped March 20th. The most recent version of the ScreenConnect binary obtained via the trial subscription bears a timestamp of May 20th. Setting the revocation time to June 3rd has no effect whatsoever on the validity of existing binaries in the wild, including those repurposed by malicious actors.
  • Jun 4: DigiCert indicates the revocation timestamp will be backdated to issuance date of the certificate (2022) once ConnectWise has additional time to publish a new version.
  • Jun 6: The ConnectWise security team gets in contact with River security team.
  • Jun 9: ConnectWise notifies customers about an impending revocation, stating that installers must be upgraded by June 10th.
  • Jun 10: ConnectWise extends deadline to June 13th.
  • Jun 13: Revocation timestamp backdated to issuance by DigiCert, invalidating all previously signed binaries. This can be confirmed by retrieving the DigiCert CRL and looking for the serial ID of the ConnectWise certificate:
$ curl -s "http://crl3.digicert.com/DigiCertTrustedG4CodeSigningRSA4096SHA3842021CA1.crl" | openssl crl -inform DER -noout -text | grep -A4 "0B9360051BCCF66642998998D5BA97CE"
    Serial Number: 0B9360051BCCF66642998998D5BA97CE
        Revocation Date: Aug 17 00:00:00 2022 GMT
        CRL entry extensions:
            X509v3 CRL Reason Code: 
                Key Compromise
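The timestamp-vs-revocation comparison in the timeline above reduces to a simple date check. In this sketch the March and June dates come from the post; assigning them to the year 2025 is an assumption:

```python
# Sketch of how a timestamped Authenticode signature interacts with a
# revocation date: binaries countersigned before the revocation date
# remain trusted, later ones are invalidated.
from datetime import date

def countersigned_binary_valid(timestamp: date, revoked_from: date) -> bool:
    """True if the binary's trusted timestamp predates the revocation."""
    return timestamp < revoked_from

malware_ts = date(2025, 3, 20)   # bogus "River desktop app" sample (assumed year)

# Initial revocation date (June 3) left existing binaries valid:
assert countersigned_binary_valid(malware_ts, date(2025, 6, 3))

# Backdating to certificate issuance (Aug 17, 2022) invalidated all of them:
assert not countersigned_binary_valid(malware_ts, date(2022, 8, 17))
```

This is why backdating the revocation to the issuance date was the step that actually neutralized the samples circulating in the wild.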

Acknowledgements

  1. Ryan Hurst for help with the investigation and recognizing how this scenario represented a divergence from CAB Forum rules around responsible use of code-signing certificates. While we were not the first to spot threat actors leveraging ScreenConnect binaries in the wild, it was Ryan who brought this matter to the attention of the one entity— DigiCert, the issuing certificate authority— in a position to take decisive action and mitigate the risk.
  2. The DigiCert team for promptly taking action to protect not only River clients, but all Windows users, against potential abuse of ScreenConnect binaries fraudulently mislabeled as other legitimate applications.

Matt Ludwigs & Cem Paya, for River Security Team