Passwords have been used to secure access to protected assets since ancient times. Despite many attempts to find a better solution, passwords are still the most widely used option for signing into services and applications in today’s digital era.
So far, all efforts to make the handling of credentials both safe and convenient have fallen short. Worse, weak or stolen credentials remain the most common attack vector, leading to major leaks such as this one: Largest collection ever of breached data found; 620 million accounts stolen and now for sale on dark web.
As the problems with passwords have been well known for ages, additional security capabilities – such as certificates based on public keys – were established. Applying cryptographic technology to authentication resulted in the basic concept of two-factor authentication. For instance, when using a public key stored on an external smart card, remembering the card's PIN was sufficient to log in. In this case, the principle of "what the user knows" (i.e., the PIN or password) was extended by "what the user has" (e.g., a certificate on the smart card), ultimately improving the overall security. However, smart cards also have numerous drawbacks, especially when it comes to cloud services.
All things considered, new authentication methods are required.
What are the alternatives?
Due to the persistent problems with passwords as stated above, new authentication methods were developed. These further supplement the principles of "What you know" and "What you have" with additional aspects like "What you are" (biometric characteristics), "Where you are" (context evaluation), and "What you are doing" (behavior analysis). True multi-factor authentication (MFA) requires a combination of different factors based on these principles.
Furthermore, the option to arbitrarily combine factors for authentication allows for fine-granular access policies, a concept known as "adaptive authentication" or "risk-based authentication." Based on user context, different factors can be requested to ensure the appropriate trust level: If someone wants to access the Intranet from within company premises, a simple domain authentication with username/password might be sufficient. However, if the same user tries to access SAP finance data from Nigeria at 3 in the morning, a higher level of assurance can be enforced by requesting additional factors.
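To illustrate, such a static rule set can be sketched in a few lines of Python. All attribute names, risk weights, thresholds, and the country list below are hypothetical; real products evaluate far richer context signals:

```python
from dataclasses import dataclass

@dataclass
class LoginContext:
    """Hypothetical context attributes an identity provider might evaluate."""
    on_premises: bool
    country: str
    hour: int       # local hour of day, 0-23
    resource: str   # e.g. "intranet", "sap-finance"

def required_factors(ctx: LoginContext) -> list[str]:
    """Static rule set: start with a password and add factors as risk grows."""
    factors = ["password"]
    risk = 0
    if not ctx.on_premises:
        risk += 1   # request comes from an unknown network
    if ctx.country not in {"DE", "AT", "CH"}:
        risk += 2   # unusual geolocation (example allow-list)
    if ctx.hour < 6 or ctx.hour > 22:
        risk += 1   # unusual time of day
    if ctx.resource == "sap-finance":
        risk += 1   # sensitive target application
    if risk >= 1:
        factors.append("totp")
    if risk >= 3:
        factors.append("push-approval")
    return factors
```

With these example rules, the on-premises intranet login from the text requires only a password, while the 3 a.m. access to finance data from abroad accumulates enough risk to demand all three factors.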
“Step-up authentication” takes this idea even further, by verifying the given level of trust for each document access, applying DLP (Data Loss Prevention) capabilities to ensure that confidential files require higher trust than public information.
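The core of step-up authentication is a comparison between the trust level of the current session and the level a document's classification demands. A minimal sketch, with hypothetical level names and an illustrative classification policy:

```python
# Hypothetical session trust levels in ascending order of assurance.
TRUST_LEVELS = {"password": 1, "mfa": 2, "mfa+device-bound": 3}

# Example DLP-style classification policy: confidential files demand more trust.
REQUIRED_LEVEL = {"public": 1, "internal": 2, "confidential": 3}

def access_allowed(session_method: str, classification: str) -> bool:
    """True if the current session is trusted enough; otherwise the user
    must step up (i.e., present an additional factor) before access."""
    return TRUST_LEVELS[session_method] >= REQUIRED_LEVEL[classification]
```

A password-only session would thus open public documents directly but trigger a step-up prompt for anything classified as confidential.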
Finally, a technique called "continuous authentication" tries to ensure that the user does not change during an already authenticated session (e.g., someone leaving their workstation or losing their mobile phone). This is realized by biometric sensors and continuous behavior analysis. Once the monitored criteria deviate from the user baseline, a re-authentication with a separate factor is enforced.
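The deviation check at the heart of continuous authentication can be sketched as a simple statistical test. The example below uses keystroke-timing means as an illustrative metric (real products combine many behavioral signals and far more robust models); the threshold of three standard deviations is an assumption:

```python
import statistics

def needs_reauth(baseline: list[float], recent: list[float],
                 threshold: float = 3.0) -> bool:
    """Flag a session for re-authentication when the recent keystroke-interval
    mean deviates from the user's baseline by more than `threshold`
    standard deviations (a deliberately simplified z-score test)."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    z = abs(statistics.mean(recent) - mean) / stdev
    return z > threshold
```

If the monitored behavior stays close to the baseline, the session continues silently; a pronounced shift triggers the re-authentication described above.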
To improve user acceptance, Single Sign-On (SSO) provides a single point of authentication, which is then propagated to various target applications within an organization or security domain. An extension of this approach is the so-called “Federated Identity”. This expands the scope to cover multiple security domains and is therefore also applicable to logins to websites, services, and applications in the internet. The identity information remains in the company’s identity store, often an Active Directory. A trustworthy identity provider accesses the identity store, identifies the user and confirms the identity of that person for the requested service without passing on confidential information. The exchange of identity information often leverages standards like SAML, OpenID Connect, or WS-Federation.
Password, smart card, and some more …
Due to its ubiquity, the mobile phone is the most obvious device used for authentication and is therefore supported by all providers as a second factor. This includes the transmission of OTPs (one-time passwords) via text message or email, but usually also covers push notifications to vendor-provided or third-party mobile apps. OTPs are still available as hardware tokens, but a push towards software-based OTPs (both TOTP and HOTP) is evident. While some vendors still strongly support smart cards, they are moving away from physical tokens to virtual smart cards. Other physical devices somewhat similar to traditional smart cards are U2F and FIDO2 tokens, which often come as USB devices or with wireless support via Bluetooth or NFC.
Today, MFA is often associated with biometric factors. As these are based on physical characteristics, they are supposed to be both more convenient for the user (one can't forget or lose them) and more secure (as they are hard to forge). However, apart from the technological challenges (not even touching on privacy aspects), the assumed positives can also be considered a drawback: Breaking your arm might change the way you hold your phone; breaking your nose probably changes the appearance of your face. Also, numerous hacks have highlighted that biometrics are not as secure as previously assumed: theguardian.com, theatlantic.com.
A new standard, WebAuthn, finalized in early 2019, aims to drive adoption of web logins with biometric-based two-factor authentication. As all major browsers either already support this standard or at least plan to do so, it has a real chance to succeed.
How is the market responding?
Market participants in the authentication segment inherently depend on interoperability with many different communication partners and hardware vendors. Therefore, the technologies used are largely based on defined standards (e.g., SAML, WebAuthn, FIDO2, etc.). Establishing trust between the identity provider and the associated services is specified in these standards and provides vendors only limited opportunities for differentiation. Hence, they primarily differ in terms of usability, supported authentication factors, and the completeness regarding the implemented authentication types:
- Many providers offer a more or less sophisticated implementation of context-aware, adaptive authentication combined with multi-factor authentication. Usually, the decision for an additionally required factor is based on static rules or simple calculations considering known environmental conditions. Products with intelligent systems for calculating risks are the exception. Overall, the risk factors included vary greatly between products; the same is also true for the granularity of the evaluation.
- The type and scope of supported factors also varies greatly from vendor to vendor: Some are focused on a small set of specific factors such as hardware tokens, while others provide a broader range of various biometric capabilities. However, almost no vendor supports a complete set of relevant factors. This must be considered as a major differentiator, and customers need to determine their specific needs regarding existing and possibly planned hardware and software purchases.
- Although step-up authentication seems a natural evolution, it is only realized to a limited extent. The underlying standards for authentication support this approach, but the target application needs to provide feedback regarding the required trust level per document. Unfortunately, this has not been widely adopted yet. Workarounds using DLP capabilities or URL tracking are not common either, as they require further infrastructure and additional document analytics features usually not found with authentication providers.
- Continuous authentication is only offered by special providers and then often provided as an OEM (SDK or similar) that has to be integrated into other authentication solutions or custom applications. This market segment is clearly still in an early stage. Currently, many providers are concentrating on B2C use cases, especially the banking sector and mainly on mobile banking. There, the balance between convenience and security is readjusted, also triggered by the new EU Payment Services Directive (PSD2). In B2B scenarios, there are more legal hurdles hindering this technology, as the use of this approach affects legal regulations regarding the collection of personal information and inherent monitoring of users.
In general, it can be noted that products in this market segment face an inevitable complexity due to the variety of technologies to be supported (web technologies, desktop integration, network-level protocols such as Kerberos or RADIUS, etc.), which is then more or less reflected in the authentication solutions.
DCSO considers the following trends:
- Long-established vendors tend to update their product portfolio in the direction of cloud services. This often goes hand in hand with a combination of distinct product streams into a single offering.
- Increasing adoption of open, web-based standards such as SAML, OpenID Connect, FIDO, and WebAuthn.
- Focus on easy-to-use login processes supported by smartphone apps, and to a certain extent biometrics (e.g., via Windows Hello for Business).
Increased market penetration of behavior-based authentication methods or true step-up authentication is currently unlikely. As far as behavior-based authentication is concerned, there is very limited acceptance of continuous monitoring of user behavior, at least in Europe. Step-up authentication depends on the willingness of application vendors to provide the respective interfaces, which is currently not to be expected.
In the end, customers have to make a careful product selection according to the given requirements (e.g., considering already purchased hardware for authentication) and strategy (e.g., on-premises vs. cloud-first approach), and according to the acceptance of different authentication methods such as continuous authentication and respective compliance aspects.
Who we are
The “Technology Scouting & Evaluation” (TSE) service identifies and evaluates promising IT security solutions. With this service, DCSO supports companies in staying ahead of a dynamic and ever-changing market. The centralized and unbiased evaluation process is supplemented with the experience of all community members.