<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://about.fqa.test.signpath.io/feed.xml" rel="self" type="application/atom+xml" /><link href="https://about.fqa.test.signpath.io/" rel="alternate" type="text/html" /><updated>2025-01-21T06:22:24+00:00</updated><id>https://about.fqa.test.signpath.io/feed.xml</id><title type="html">SignPath.io</title><entry><title type="html">Building trusted software for macOS</title><link href="https://about.fqa.test.signpath.io/blog/2024/11/29/building-trusted-software-for-macos" rel="alternate" type="text/html" title="Building trusted software for macOS" /><published>2024-11-29T06:00:00+00:00</published><updated>2024-11-29T06:00:00+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2024/11/29/building-trusted-software-for-macos</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2024/11/29/building-trusted-software-for-macos"><![CDATA[<p>The App Store is a powerful force in the digital world, serving as a portal to a massive user base. Today, there are over 2.2 billion active Apple devices worldwide. Apple’s App Store alone attracts more than <a href="https://www.apple.com/newsroom/2023/05/developers-generated-one-point-one-trillion-in-the-app-store-ecosystem-in-2022/">650 million</a> weekly active users. Often, however, it makes sense to bypass the App Store and deliver your applications directly to your users, for example, via a download from your website.</p>
<p>The downside of direct distribution is the annoying pop-ups and alerts that can plague users trying to install and run apps. To prevent that experience, you must digitally sign your app so that macOS trusts it.</p>
<p>In this post, we’ll show you how to deliver trusted macOS apps, without the added overhead of distributing via the App Store.</p>
<h2 id="gatekeeper">Gatekeeper</h2>
<p>To protect Apple users against untrustworthy apps, macOS relies on a technology called ‘Gatekeeper’. Gatekeeper scans software from any source, whether an app store, an arbitrary website, or an email attachment. The application that downloads the file (the browser or email client) adds the file attribute <code class="language-plaintext highlighter-rouge">com.apple.quarantine</code>. The quarantine attribute tells Gatekeeper to verify the digital signature and to check whether the file is notarized before executing it. We’ll discuss notarization later. The critical point is that your software must be digitally signed by a private key associated with a trusted X.509 certificate.</p>
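<p>You can see exactly what Gatekeeper will act on by inspecting the quarantine attribute from the terminal. A minimal sketch; the file path here is a placeholder:</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code># List the extended attributes on a downloaded file (path is illustrative)
xattr ~/Downloads/sample.dmg
# Print the value of the quarantine attribute, if present
xattr -p com.apple.quarantine ~/Downloads/sample.dmg
</code></pre></div></div>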
<h2 id="formats">Formats</h2>
<p>Applications on macOS are saved in the <code class="language-plaintext highlighter-rouge">.app</code> format, a file system folder with a specific layout. If you want to distribute software, you package the <code class="language-plaintext highlighter-rouge">.app</code> folder using one of two container formats:</p>
<ul>
<li>Disk Images (<code class="language-plaintext highlighter-rouge">.dmg</code>): Easiest when distributing a single file/app.</li>
<li>Installer packages (<code class="language-plaintext highlighter-rouge">.pkg</code>): Allows for specialized scenarios, such as running code during the installation.</li>
</ul>
<p>(Refer to Apple’s <a href="https://developer.apple.com/documentation/xcode/packaging-mac-software-for-distribution">official guide to packaging applications</a> for more details.)</p>
<h2 id="prereqs-for-code-signing">Prereqs for Code Signing</h2>
<p>First, make sure you have:</p>
<ul>
<li>An <a href="https://developer.apple.com/programs/enroll/">Apple developer account</a>. It costs $99 a year.</li>
<li>Xcode</li>
<li>Xcode developer tools.
<ul>
<li>Run <code class="language-plaintext highlighter-rouge">xcode-select --install</code> on the command line</li>
</ul>
</li>
</ul>
<h2 id="code-signing-certificates">Code Signing Certificates</h2>
<p>Apple software is signed using X.509 code signing certificates.</p>
<h3 id="certificate-signing-requests">Certificate Signing Requests</h3>
<p>Code Signing Certificates link metadata, such as the name and place of registration of your company, to an asymmetric, public/private key pair. You can request a certificate from a Certificate Authority (CA), providing the metadata and the public key. This request is called a Certificate Signing Request (CSR); the CA verifies that the metadata is correct before issuing and signing the certificate.</p>
<p>The private key remains under your control. It must be securely stored. Anybody in possession of the private key can sign software in the name of your organization.</p>
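<p>For illustration, here is how such a key pair and CSR can be created with OpenSSL. This sketch generates the private key on the local machine for simplicity; as discussed below, production keys are better generated on an HSM. File names and subject fields are placeholders:</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Generate an RSA 2048-bit private key (the length Apple requires for its CSRs)
openssl genrsa -out codesign.key 2048
# Create a CSR carrying your organization's metadata and the public key
openssl req -new -key codesign.key -out codesign.csr \
  -subj "/CN=Example Corp/O=Example Corp/C=US"
# Inspect the CSR before submitting it to the CA
openssl req -in codesign.csr -noout -text
</code></pre></div></div>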
<h3 id="storage-of-the-private-key">Storage of the Private Key</h3>
<p>Apple’s <a href="https://developer.apple.com/help/account/create-certificates/create-a-certificate-signing-request">documentation</a> refers to creating the keypair on a Mac and storing it in the local keychain, protected by a password. To make the keys available on other build/developer machines, you have to share the key pair. Popular tools, such as <a href="https://docs.fastlane.tools/actions/match/">fastlane</a>, recommend storing certificates and encrypted private keys in a central location where they can be accessed by any developer. Of course, this introduces risk. It makes it easier for an attacker to steal the key file and encryption password to sign any software in your company’s name! Since no further authentication is required, this attack vector is especially dangerous when publishing software outside the confines and relative security of the App Store.</p>
<p>In the Windows world, the CA/Browser Forum mandates that private keys for code signing certificates are generated on Hardware Security Modules (HSMs), where they are protected from theft. It’s a good idea to follow these guidelines for Apple software as well.</p>
<p>Cloud-based code signing solutions (<a href="https://about.signpath.io">SignPath</a>, for one) make it easy to use an HSM for code signing. SignPath provides an option, via the web-based console, to generate a certificate signing request (CSR) for a private key generated on hardware. It’s a simple point-and-click operation, with absolutely zero configuration or provisioning hassle.</p>
<h3 id="creating-a-test-certificate">Creating a Test Certificate</h3>
<p>These days, everyone builds software on continuous integration systems. No matter which system you use, it’s a good idea to bake code signing into the build process itself. This will minimize the hassle of doing manual signing or writing scripts at build and release time.</p>
<p>But you probably don’t want to sign all your nightly builds with an official Apple code signing certificate. It’s easier to just use a test certificate. One hitch is that you need to configure your system to trust that test certificate.</p>
<p>To make it trusted, you can install the test certificate on a keychain in your macOS system and enable the “Code Signing” trust setting. Gatekeeper will still identify and complain about the missing notarization though. If you’re doing manual testing, you can right-click the application and ignore the warning. If you’re testing programmatically, you can remove the quarantine flag. Use the following command to avoid Gatekeeper checks altogether:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>xattr -d com.apple.quarantine ~/Downloads/sample.app
</code></pre></div></div>
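<p>To enable the “Code Signing” trust setting mentioned above, a hedged sketch of the keychain route might look like this; the certificate file name is a placeholder, and admin rights are required:</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Import the test certificate into the system keychain and
# mark it as trusted for the code signing policy
sudo security add-trusted-cert -d -r trustRoot -p codeSign \
  -k /Library/Keychains/System.keychain test-certificate.cer
</code></pre></div></div>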
<p><em>Important: Apple’s <code class="language-plaintext highlighter-rouge">codesign</code> tool, which is used for signing software, does not require certificates to be issued by Apple. Package installers (<code class="language-plaintext highlighter-rouge">.pkg</code>) are a notable exception: the <code class="language-plaintext highlighter-rouge">.pkg</code> format only supports official certificates issued by Apple, so test certificates cannot be used.</em></p>
<h3 id="creating-a-release-certificate">Creating a Release Certificate</h3>
<p>If you want all the Apple devices in the world to trust your application, you need an official certificate from Apple. Those certificates are trusted by default on all Apple devices. You can get an official Apple certificate from the <a href="https://developer.apple.com/documentation/xcode/packaging-mac-software-for-distribution">developer portal</a>. Apple supports a variety of certificates. Depending on the chosen distribution format, you need to create either a Developer ID Installer (for <code class="language-plaintext highlighter-rouge">.pkg</code>) or a Developer ID Application (for <code class="language-plaintext highlighter-rouge">.dmg</code> and <code class="language-plaintext highlighter-rouge">.app</code>) certificate.</p>
<p><img src="/assets/posts/2024-10-25_developer_portal.png" alt="Screenshot of Apple's Developer Portal" /></p>
<p>After choosing the certificate type, you can upload your Certificate Signing Request (CSR). (Hopefully, you generated the private key for the CSR on an HSM.) Note that the underlying key pair must be generated using the <strong>RSA algorithm</strong> with a length of <strong>2048 bits</strong>. Apple will issue the certificate, and then you can import it into your code signing solution of choice.</p>
<h3 id="putting-code-signing-certificates-to-work">Putting Code Signing Certificates to Work</h3>
<p>You can’t distribute that HSM-backed key to your build/developer machines. That key will never leave the HSM. So how do you use it for code signing in your continuous integration pipeline? Simple. You just need a CryptoTokenKit extension. It registers your certificate in the system’s keychain, along with an identifier for the private key. During the signing operation, Apple’s crypto backend calls the CryptoTokenKit extension, which in turn forwards any signing operation to the signing service (in our case, SignPath.io). If you’re interested in the technical details, there’s an excellent deep dive into the CryptoTokenKit architecture by Timothy Perfitt on <a href="https://twocanoes.com/cryptotokenkit-communication/">this website</a>.</p>
<p>The documentation for the <strong>SignPath CryptoTokenKit</strong> can be found <a href="/documentation/crypto-providers/macos">here</a>.</p>
<h2 id="notarization">Notarization</h2>
<blockquote>
<p>The Apple notary service is an automated system that scans your software for malicious content, checks for code-signing issues, and returns the results to you quickly. If there are no issues, the notary service generates a ticket for you to staple to your software; the notary service also publishes that ticket online where Gatekeeper can find it.</p>
</blockquote>
<p>If you want to avoid warning dialogs during install, notarization is required when distributing your software outside of the App Store. For more details, see the <a href="https://developer.apple.com/documentation/security/notarizing-macos-software-before-distribution">official documentation</a>. To automate this process, Apple provides the <code class="language-plaintext highlighter-rouge">notarytool</code> and <code class="language-plaintext highlighter-rouge">stapler</code> executables. The former uploads the artifact to the Apple servers for automated scanning, and the latter attaches the notarization ticket to the software. You will have to authenticate against Apple using an <a href="https://support.apple.com/en-us/102654">app-specific password</a>. According to Apple, notarization is completed within 5 minutes for most software and within 15 minutes for 98% of uploaded software. That’s fast enough to include in most automated builds. However, note that the first submission can take up to a day.</p>
<p><em>Notarization requires a signature with a valid release certificate issued by Apple; test certificates won’t work.</em></p>
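<p>Once an artifact is signed and notarized, you can preview Gatekeeper’s verdict from the command line. A minimal sketch, assuming the <code class="language-plaintext highlighter-rouge">.dmg</code> produced in the sample below:</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Ask the assessment engine whether the disk image would be accepted
spctl --assess --type open --context context:primary-signature -vv ./build/sample.dmg
# Confirm that the notarization ticket is stapled to the artifact
xcrun stapler validate ./build/sample.dmg
</code></pre></div></div>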
<h3 id="hardened-runtime">Hardened Runtime</h3>
<p>A hardened runtime is a prerequisite for notarization. Follow the <a href="https://developer.apple.com/documentation/xcode/configuring-the-hardened-runtime">official article</a> by Apple to add the hardened runtime capability to your app. You need to list all exceptions to the restrictions explicitly.</p>
<blockquote>
<p>The Hardened Runtime is a collection of system-enforced restrictions that disable a set of functional capabilities, such as loading third-party frameworks, and prohibit access to restricted resources, such as the device’s built-in camera, to prevent certain classes of exploits from compromising the runtime integrity of your macOS app.</p>
</blockquote>
<p>When signing your software with a hardened runtime, you need to add the <code class="language-plaintext highlighter-rouge">--options=runtime</code> parameter to the <code class="language-plaintext highlighter-rouge">codesign</code> call. Make sure to test your app thoroughly after enabling the hardened runtime to ensure that all use-cases still work.</p>
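<p>To confirm that the hardened runtime was actually applied, you can inspect the signature details; this sketch assumes the sample app path used later in this post:</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code># codesign prints signature details on stderr; after signing with
# --options=runtime, the CodeDirectory line should contain the "runtime" flag
codesign -d -vv ./build/sample.app
</code></pre></div></div>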
<h3 id="the-get-task-allow-entitlement">The <code class="language-plaintext highlighter-rouge">get-task-allow</code> entitlement</h3>
<p>When building the software automatically using the <code class="language-plaintext highlighter-rouge">xcodebuild</code> tool, there is another special setting to consider: at build time, Apple automatically injects an entitlement used for debugging, which is incompatible with the notarization process. You can disable this behavior by passing the <code class="language-plaintext highlighter-rouge">CODE_SIGN_INJECT_BASE_ENTITLEMENTS=NO</code> flag to your <code class="language-plaintext highlighter-rouge">xcodebuild</code> call. See <a href="https://developer.apple.com/documentation/security/resolving-common-notarization-issues#Avoid-the-get-task-allow-entitlement">here</a> for more technical details.</p>
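<p>For instance, a release build that skips the injected debugging entitlement could be invoked as follows; the project and scheme names are placeholders:</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Build for release without the auto-injected get-task-allow entitlement
xcodebuild -project sample.xcodeproj -scheme sample -configuration Release \
  CODE_SIGN_INJECT_BASE_ENTITLEMENTS=NO build
</code></pre></div></div>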
<h2 id="sample">Sample</h2>
<p>Let’s get hands-on! First, you’ll need a free trial account on SignPath. Sign up here: <a href="https://login.signpath.io/">https://login.signpath.io/</a></p>
<p>Below is a code snippet that shows how to sign and notarize an application. For a full working sample, visit the <a href="https://github.com/SignPath/demo-macos">demo repository</a>.</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>
<span class="c">### Install SignPath MacOSCryptoTokenKit</span>
curl <span class="nt">-o</span> SignPathCryptoTokenKit.dmg https://download.signpath.io/cryptoproviders/macos-cryptotokenkit/2.0.0/SignPathCryptoTokenKit.dmg
codesign <span class="nt">-dv</span> <span class="nt">--verbose</span> SignPathCryptoTokenKit.dmg <span class="c"># check signature</span>
hdiutil attach ./SignPathCryptoTokenKit.dmg <span class="nt">-mountroot</span> ./tools <span class="c"># mount the disk image</span>
<span class="c">### Register the CryptoTokenKit</span>
open <span class="s2">"./tools/SignPathCryptoTokenKit/SignPathCryptoTokenKit.app"</span> <span class="nt">--args</span> <span class="se">\</span>
<span class="nt">--organization-id</span> <span class="nv">$SIGNPATH_ORGANIZATION_ID</span> <span class="se">\</span>
<span class="nt">--project-slug</span> <span class="nv">$SIGNPATH_PROJECT_SLUG</span> <span class="se">\</span>
<span class="nt">--signing-policy-slug</span> <span class="nv">$SIGNPATH_SIGNING_POLICY_SLUG</span>
<span class="nb">sleep </span>20 <span class="c"># wait for token to be registered</span>
<span class="c">### Sign the sample.app file</span>
codesign <span class="nt">-f</span> <span class="nt">--timestamp</span> <span class="nt">--options</span><span class="o">=</span>runtime <span class="se">\</span>
<span class="nt">-s</span> <span class="nv">$CERTIFICATE_SUBJECT_NAME</span> <span class="se">\</span>
<span class="nt">--entitlements</span> sample/sample/sample.entitlements <span class="se">\</span>
./build/sample.app
codesign <span class="nt">-dv</span> <span class="nt">--verbose</span> ./build/sample.app <span class="c"># check signature</span>
<span class="c">### Create and sign a .dmg file</span>
hdiutil create <span class="nt">-format</span> UDZO <span class="nt">-srcfolder</span> ./build/sample.app ./build/sample.dmg
codesign <span class="nt">-f</span> <span class="nt">--timestamp</span> <span class="nt">--options</span><span class="o">=</span>runtime <span class="se">\</span>
<span class="nt">-s</span> <span class="nv">$CERTIFICATE_SUBJECT_NAME</span> <span class="se">\</span>
<span class="nt">--entitlements</span> sample/sample/sample.entitlements <span class="se">\</span>
./build/sample.dmg
codesign <span class="nt">-dv</span> <span class="nt">--verbose</span> ./build/sample.dmg <span class="c"># check signature</span>
<span class="c">### Notarize the .dmg file</span>
xcrun notarytool submit ./build/sample.dmg <span class="se">\</span>
<span class="nt">--apple-id</span> <span class="nv">$APPLE_ID</span> <span class="se">\</span>
<span class="nt">--team-id</span> <span class="nv">$APPLE_TEAM_ID</span> <span class="se">\</span>
<span class="nt">--password</span> <span class="nv">$APPLE_NOTARIZATION_APP_SPECIFIC_PASSWORD</span> <span class="se">\</span>
<span class="nt">--wait</span> <span class="se">\</span>
<span class="nt">--timeout</span> 15m
xcrun stapler staple ./build/sample.dmg <span class="c"># staple the notarization result</span>
</code></pre></div></div>
<h2 id="conclusion">Conclusion</h2>
<p>You now have everything you need to start building and releasing trusted applications for macOS, directly to your user base. The Apple App Store and enterprise distribution channels provide safe and convenient ways to reach users, but direct distribution is often preferred for Enterprise software. However, it does come with the added responsibility of properly signing and notarizing your applications. By following these requirements and keeping your keys secure, you can safely distribute your application outside of the App Store, while still ensuring a quality end user experience with your trustworthy apps.</p>]]></content><author><name>Paul Savoie</name></author><category term="blog" /><summary type="html"><![CDATA[description]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2024-09-09-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2024-09-09-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">From Implicit to Explicit: Why Code Signing is the Missing Link in DevSecOps</title><link href="https://about.fqa.test.signpath.io/blog/2024/09/10/implicit-to-explicit" rel="alternate" type="text/html" title="From Implicit to Explicit: Why Code Signing is the Missing Link in DevSecOps" /><published>2024-09-10T08:00:00+00:00</published><updated>2024-09-10T08:00:00+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2024/09/10/implicit-to-explicit</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2024/09/10/implicit-to-explicit"><![CDATA[<h1 id="from-implicit-to-explicit-why-code-signing-is-the-missing-link-in-devsecops">From Implicit to Explicit: Why code signing is the missing link in DevSecOps</h1>
<p>“Trust, but verify”: this is a well-known proverb that defined the Cold War era. Today, these powerful words alone could be used to describe the philosophy behind the security of the world’s digital infrastructure, from satellites to web browsers.</p>
<p>Thanks to modern cryptographic techniques, especially asymmetric key encryption, we can ensure the integrity and authenticity of billions of website visits and software downloads worldwide. Users can remain blissfully unaware of the behind-the-scenes process to benefit from it – a hallmark of great security. This invisible protection works silently to keep our digital interactions safe and trustworthy.</p>
<p>However, there’s an inconvenient truth for us software professionals who see how “the sausage is made”: when it comes to assembling software for professional use, the benefits of modern cryptography, in the form of code signing, prove to be the exception rather than the rule.</p>
<p>In this blog post, I want to emphasize the importance of explicitly establishing trust in our software supply chains. As more DevSecOps initiatives launch to build a more resilient global digital infrastructure, we can no longer justify trusting without verifying the software components in our pipelines.</p>
<h2 id="why-devsecops-needs-code-signing">Why DevSecOps Needs Code Signing</h2>
<p>Today, the importance of protecting supply chains cannot be overstated: Gartner predicts that by 2025, 45% of organizations worldwide will have experienced attacks on their software supply chains (a three-fold increase from 2021). The reason is straightforward: with the rise of DevOps and open source, software supply chains have rapidly grown more complex. This has created an attack surface that is not only much larger but also harder to measure and, in many cases, beyond direct control.</p>
<p>Code signing has long been a pillar of IT security. Yet applications have grown in complexity, often consisting of hundreds or even thousands of components. Such complexity makes comprehensive code signing seem almost impossible, and many administrators find that the limited guarantees of incomplete verification simply aren’t worth the hassle.</p>
<p>Where it is in use, code signing often occurs too late in the development cycle to be of much use. Typically, DevOps teams only sign the final, assembled software build before delivery. Even worse, software is often signed without any mechanism in place to verify the signature!</p>
<p>There are many drivers behind this lack of adoption, including limited awareness, budget constraints, and competing priorities. The challenge is reminiscent of the problem DevSecOps was created to address: by integrating security and development throughout the entire process, we can ensure that security becomes an integral part of the software lifecycle rather than an afterthought.</p>
<p>So why isn’t code signing integrated in the same way as other DevOps tools?</p>
<p>Code signing is the only surefire way to guarantee that pipelines haven’t been tampered with, and that they produce the expected output. It’s time we raise the bar to ensure the integrity of entire build pipelines instead of beefing up each individual step (code, build, deliver) separately.</p>
<h2 id="the-problem-with-traditional-code-signing">The Problem with Traditional Code Signing</h2>
<p>Code signing means attaching a digital signature, backed by a certificate, to software. Traditionally, within a Public Key Infrastructure (PKI), a certificate authority (CA) verifies the developer’s identity and adds the developer’s public key to the certificate. The developer then hashes the code, encrypts the hash (digest) with their private key, and combines this with the certificate and hash function to create a signature block. This block is then inserted into the software, completing the code-signing process.</p>
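<p>The cryptographic core of that cycle can be sketched with OpenSSL. This illustrates the generic hash-sign-verify flow, not any particular signing tool; all file names are placeholders:</p>
<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Publisher side: hash the artifact and sign the digest with the private key
openssl dgst -sha256 -sign codesign.key -out app.sig app.bin
# Consumer side: extract the public key from the certificate and verify
openssl x509 -in codesign.crt -pubkey -noout > codesign.pub
openssl dgst -sha256 -verify codesign.pub -signature app.sig app.bin
</code></pre></div></div>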
<p>The problem with this process arises when the build environment grows more complex and dynamic:</p>
<ul>
<li>In DevOps, build pipelines involve many steps that require careful verification. Third-party dependencies need thorough vetting before approval. Securing continuous integration (CI) systems is already a challenge for DevOps engineers who must balance security and velocity. Imposing traditional code signing can seem impractical, if not impossible.</li>
<li>Code signing solutions must be selective about what they accept to sign. They can’t simply sign anything presented to them. Instead, they need to verify that each signature request is valid and comes from a trusted source, following a specific set of rules.</li>
<li>As more developers need permission to sign software, encryption keys must be both accessible and carefully safeguarded. Key management is a huge challenge in itself: code signing must use keys that are protected on hardware. So-called hardware security modules (HSMs) were simply not designed to support modern CI/CD pipelines, which require flexibility and agility.</li>
</ul>
<p>In short, traditional code signing demands complex security infrastructure, often beyond the capacity of all but the largest organizations to handle. For this reason, code signing adoption by DevSecOps teams has lagged behind other software supply chain security capabilities, such as software composition analysis (SCA).</p>
<p>But source code integrity and trusted build systems don’t have to be daunting. Let’s explore a few simple, pragmatic solutions.</p>
<h2 id="how-signpath-brings-code-signing-to-devsecops">How SignPath Brings Code Signing to DevSecOps</h2>
<p>At SignPath, our mission is to relentlessly simplify and abstract the complexity out of code signing infrastructure. The result is automated, authenticated builds that eliminate friction from the development process, while significantly improving software supply chain security. Our robust and flexible mechanism is a natural fit for modern CI/CD pipelines and software development practices.</p>
<p>We created a solution that enables teams to:</p>
<ul>
<li>Neatly integrate code signing into your existing CI/CD pipelines: we enable fully automated code signing workflows. Our solution eliminates many maintenance headaches by avoiding cumbersome ad-hoc scripts that need to securely handle cryptographic providers and tools. Metadata from the CI system can easily be attached to signing requests, providing additional context. This allows you to know exactly what got signed and enhances transparency.</li>
<li>Deep-sign software packages: we let you sign multiple artifacts—such as executables, packages, SBOMs, or files within packages—in a single request. This feature is particularly useful when both the application files and the package (or installer) require signing, which often creates a challenging dependency between the build process and the code signing process.</li>
<li>Centrally manage signing policies: organizations can define comprehensive, fine-grained signing policies in a central location. These policies control permissions, approvals, and origin verification, ensuring every signing operation follows strict security guidelines. Regardless of the tech stack, build process, or signing methods used, all rules are declared in a single location.</li>
</ul>
<p>As a result, you gain strong cryptographic assurance that every software release:</p>
<ul>
<li>Can be comprehensively traced back to a specific source code version</li>
<li>Meets all policy requirements, including those for reviews and testing</li>
<li>Originates from secure infrastructure that resists tampering, from within or without</li>
</ul>
<h2 id="wrap-up">Wrap Up</h2>
<p>Software supply chain attacks are on the rise. It’s time to raise our collective standards and apply the same security mantra when building software as when we deliver it: trust but verify, making the implicit explicit.</p>
<p>By making code signing an intrinsic part of the DevSecOps framework, organizations can significantly reduce the risk of tampering, ensure compliance, and build resilient software supply chains.</p>
<p>Thanks to years of experience implementing code signing into the most complex infrastructure, SignPath is proud to play a unique role in bringing code signing to DevSecOps.</p>
<p>To experience the distinctive advantages SignPath offers, <a href="https://forms.gle/sAHSsxgASx2BYPzc9">request a demo</a> today.</p>]]></content><author><name>Paul Savoie</name></author><category term="blog" /><summary type="html"><![CDATA[description]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2024-09-09-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2024-09-09-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">New year, new faces: SignPath expands its go-to-market activities</title><link href="https://about.fqa.test.signpath.io/blog/2024/02/22/new-year-new-faces" rel="alternate" type="text/html" title="New year, new faces: SignPath expands its go-to-market activities" /><published>2024-02-22T08:00:00+00:00</published><updated>2024-02-22T08:00:00+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2024/02/22/new-year-new-faces</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2024/02/22/new-year-new-faces"><![CDATA[<p><strong>SignPath, the leading provider of advanced code signing solutions and build protection, will boost and expand its go-to-market activities. With this move, SignPath also grows its leadership team.</strong></p>
<p>It’s official now: SignPath, based in Vienna, Austria, has been fully independent since the end of 2023. The company was initially founded in 2017 as a subsidiary of RUBICON IT GmbH, a leading European software development company.</p>
<p>In order to position itself more broadly as a supplier of code integrity and software supply chain security, and to strengthen its entry into the international market, SignPath expanded its management team at the beginning of 2024:</p>
<figure class="image">
<img src="/assets/posts/2024-02-19_new_year_new_faces.jpg" alt="New management team: Paul Savoie, Stefan Wenig, Stephan Brack" />
<figcaption>
<p>New management team: Paul Savoie, Stefan Wenig, Stephan Brack</p>
</figcaption>
</figure>
<p><strong>CEO Stefan Wenig</strong> will continue to lead the positioning, product strategy and thought leadership within the company. Stefan has more than 25 years of experience as a software architect and development manager in the IT industry.</p>
<p>New to the management team is <strong>Chief Sales Officer (CSO) Stephan Brack</strong>. Stephan has wide-ranging experience in positioning and selling security products with his own company, Protected Networks. He will focus on growing and supporting the sales and reseller team. He will also establish the relevant content activities to support customers and teams.</p>
<p><strong>Chief Product Officer (CPO) Paul Savoie</strong> will strengthen SignPath’s leadership position in the growing market of application security and code integrity. Paul has been with SignPath since 2018. He will use his experience as an entrepreneur and product designer to build on and expand the vision of SignPath to address current and future security challenges.</p>
<p>“I am excited that we can now start the new year at full speed to expand and optimize our product portfolio,” says SignPath CEO Stefan Wenig. “We’ll continue to drive forward our zero-trust vision in software production and thus ensure greater security in the entire IT environment.”</p>
<p>For further information, contact us at <a href="mailto:info@signpath.io">info@signpath.io</a></p>]]></content><author><name>Klaus Rathje</name></author><category term="blog" /><summary type="html"><![CDATA[SignPath, the leading provider of advanced code signing solutions and build protection, will boost and expand its go-to-market activities. With this move, SignPath also grows its leadership team.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2024-02-19-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2024-02-19-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Cybernews interview with our CEO: supply chains and code signing</title><link href="https://about.fqa.test.signpath.io/blog/2022/02/21/cybernews-interview" rel="alternate" type="text/html" title="Cybernews interview with our CEO: supply chains and code signing" /><published>2022-02-21T16:00:00+00:00</published><updated>2022-02-21T16:00:00+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2022/02/21/cybernews-interview</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2022/02/21/cybernews-interview"><![CDATA[<p>Read the original interview at <a href="https://cybernews.com/security/stefan-wenig-signpath-you-can-spend-millions-of-dollars-for-it-security-and-still-become-a-victim-of-an-attack-on-a-supplier/">Cybernews</a>.</p>
<p><strong><em>If there’s anything that companies shouldn’t put aside, it’s cybersecurity because threats are lurking around every corner of the cyber world.</em></strong></p>
<p><em>Businesses of any size can unexpectedly fall victim to various dangers, including ransomware attacks, data breaches, or identity theft. It can lead to demanding a financial ransom, loss of sensitive data, or damaged brand reputation.</em></p>
<p><em>And while it’s only a matter of time when a company will encounter cybercriminals, there are various tools, such as secure code signing services, that can be used to prevent attacks.</em></p>
<p><em>For this reason, <a href="https://cybernews.com/">Cybernews</a> invited Stefan Wenig, the CEO of SignPath – a company that specializes in safe code signing. Wenig shared his views about cybersecurity and the importance of code signing.</em></p>
<p><strong>How did SignPath originate? What has your journey been like so far?</strong></p>
<p>Back when I was in charge of software development at <a href="https://www.rubicon.eu/">RUBICON</a>, we had to sign some client-side add-ons for our enterprise Web applications. We purchased a certificate, which came on a USB token. That required a lot of driver and configuration fumbling and was impossible to integrate with build automation. You end up with a tedious, insecure, manual process.</p>
<p>By then, code signing incidents had started making serious news. Initially, this was about key theft, but more recent cases pointed towards a different problem – how do you make sure that you only sign software releases that were built in a secure way, from a properly reviewed source, according to all your policies?</p>
<p>Later, some government and enterprise customers started asking for signatures on server applications too, so integration and automation became a key concern. Some customers prescribed secure code signing practices, so now it had to be automatic and secure, across a wide range of products and projects.</p>
<p>We didn’t find a good solution on the market, so we experimented with a signing service that offloads code signing to a secure, isolated service. That turned out to be harder than we thought, so we knew early on that if we were going to do this, we’d build it as a product.</p>
<p>Fast forward 5 years, SignPath is now an independent company, providing this solution as a Cloud service and on-premises to SMBs and large enterprises.</p>
<p><strong>Can you tell us a little bit about what you do? Why is code signing needed?</strong></p>
<p>Code signing is the process of putting a digital signature on a program before it is distributed, installed, and executed. Like a digital signature on a document, it guarantees authenticity and integrity. Authenticity means that files have been signed by the software’s publisher. Integrity means that the program’s files have not been modified since.</p>
<p>Most current platforms automatically verify signatures when programs and apps are downloaded and installed. They all bring their specific formats and procedures: Microsoft, Java, Google, and Apple all use certificate-based formats. This allows customers to verify the legal name and country of the publisher. Linux and Docker do it a bit differently.</p>
<p>Code signing can ensure that only trusted programs are installed and run on any computer or smartphone. So, at least in theory, if a program is properly signed by a trusted publisher, you should be able to trust it. But there are pitfalls.</p>
<p><strong>How do cybercriminals take advantage of software that isn’t code signed? What is the worst that can happen?</strong></p>
<p>Whether it’s viruses, Trojan horses, or targeted attacks where hackers directly attack an organization’s systems: cybercriminals try to get your computer to run their own programs. Consumers face ransomware, theft of identity or financial data, or permanent takeovers of their computers into botnets. Corporations, government agencies, and other organizations also find that these programs are used to open backdoors for further attacks and to infect other computers in the network.</p>
<p>A widely known defense against malicious code is anti-malware programs. They detect and prevent some breaches, but this is an arms race: malware changes all the time to stay undetected. And it’s virtually impossible for anti-malware tools to detect a targeted attack.</p>
<p>That’s why modern operating systems go out of their way to prevent untrusted programs from running at all. Code signing is the primary mechanism to ensure and verify this trust.</p>
<p><strong>Do you think the recent global events altered the way people perceive cybersecurity?</strong></p>
<p>Absolutely, in two ways. First, there was the <a href="https://cybernews.com/editorial/us-secret-service-vet-solarwinds-was-cybersecurity-s-9-11/">Sunburst</a> incident. In 2020, Russian hackers had infiltrated the software company SolarWinds and inserted a backdoor into their software. The software is used by tens of thousands of organizations, so they all had these backdoors opened. The incident went undetected for many months before security specialist FireEye found it.</p>
<p>Sunburst also demonstrated that code signing is not enough. The modified software was properly signed, and not for lack of a safe signing infrastructure. The incident happened somewhere else in the build pipeline, but there was no connection between build policies and signing. In fact, signing only made it worse – it conveyed that the software should be trusted when it was already weaponized.</p>
<p>Similar incidents have happened before and since, but Sunburst made the big news. Finally, everybody realized how susceptible even the most security-savvy organizations are to supply-chain attacks. In other words, you can spend tens or hundreds of millions of dollars for IT security and still become a victim of an attack on a supplier, or on your own software development teams.</p>
<p>And of course, more recent events put us all in a dangerous place. Everyone is very aware that current conflicts are fought not only on real battlefields, but economics and the Internet are weaponized, too. This is not just about losing credit card data anymore.</p>
<p><strong>In your opinion, which industries are especially vulnerable to attacks carried out by injecting malicious code?</strong></p>
<p>The most advanced attacks these days come from nation-state actors. So obviously, government agencies and their contractors are high on the list. When it comes to attacking nations, critical industries include defense, aerospace, finance, health care, and energy, especially nuclear.</p>
<p>But nation-states participate in economically motivated attacks, too. And of course, classic cybercrime is still a thing. No industry is spared when it comes to stealing customer data or blackmail.</p>
<p>Modern supply-chain attacks specifically expose the tech industry. If the first move is at a software company, attackers can potentially get at their customers. And since today all technology comes with software, this threat extends to the entire tech industry.</p>
<p><strong>Why do you think code signing is often overlooked? Are there any other security details that you believe are often pushed to the background?</strong></p>
<p>When you ship software to consumers, code signing is often required, so that’s an obvious starting point for many organizations. However, software running on servers usually doesn’t require code signing. So naturally, enterprise software and Web applications are often used without code signing.</p>
<p>Sometimes software is signed, but the signatures are not routinely and automatically verified as part of the deployment procedures. Reasons for this include awareness, budgets, and prioritizing, but many IT administrators will also tell you that they don’t do it because too many programs or components are not signed, and they don’t see the point of only partial verification.</p>
<p>Obviously, both problems expose the servers and networks to significant risk. Installing software of unverified origin naturally means that attackers can attack the process at any point of their choosing.</p>
<p><strong>What are the best practices companies should follow when developing and launching software?</strong></p>
<p>Let’s start with the obvious – all software should be signed before it leaves the publisher’s environment. This allows customers (or operations teams) to verify its origin and integrity.</p>
<p>The next item on everyone’s checklist is to secure this process. Like almost everything in cryptography, code signing relies on the secrecy of private keys. If they get stolen, somebody else will be able to sign their own programs (or modified versions of your software) in your name. We already see secure key management for code signing as an emerging product category.</p>
<p>But signing your software is only half the story. Our industry is increasingly aware of the need to design, code, and test software for security. But less thought is given to the fact that these processes are under attack. New versions of software are being built all the time, and every time you build a program, there’s that risk of somebody messing with the process.</p>
<p>You want to make sure that only software versions that are built in a secure way get signed. And that’s really the hard part of code signing. How do you make sure that the process didn’t get compromised before you do the final signature?</p>
<p>You also want to have some control over releases, such as creating release candidates and delaying the decision to sign and release. That might include automatic testing gateways or manual approvals. However, automatic verifications still have to happen immediately.</p>
<p><strong>What cyber threats do you find the most concerning nowadays? What can average individuals do to protect themselves?</strong></p>
<p>On an enterprise level, supply chain attacks are changing what we have learned about security. When you cannot trust your suppliers, how are you going to defend your own organization? It’s no longer enough to have systems and processes in place that protect your own infrastructure, you also need to worry about who you’re buying from.</p>
<p>Is the software actually from the right source and free from modifications? It’s easy to check a signature. But you also start to worry about which publishers (or which certificates) to trust. Controlling the supply chains of software products is not an easy task. We see formal processes around that, with enterprises starting to track their supply chains, define technical and legal requirements and align incident processes. But software is made of software, so each supplier has their own supply chain, and that’s a difficult thing to track with lists and contracts.</p>
<p>We need to get to the point where an automatic signature verification can convey security properties, such as the policies that were used when making and signing the software.</p>
<p>For individuals, the best advice we can give is what you’re hearing everywhere: don’t mess with security defaults, don’t trust anything on the Internet, and take warnings seriously.</p>
<p><strong>What does the future hold for SignPath?</strong></p>
<p>We’re always improving our core product. With a new release every other week, expect us to keep adding new platforms and formats, integrations, policies, and other features.</p>
<p>We’ll also extend our <a href="https://signpath.org/">SignPath Foundation</a> initiative. Through the Foundation, we’re offering free code signing to Open Source projects. We provide certificates for each project, but they are issued in our name, and the Foundation is in control of the signing policies. We’re using the same functionality that enterprises would deploy to stay in control of certificates and policies across their individual teams.</p>
<p>This allows users to easily trust the downloadable program packages just as much as they trust the source code hosted on GitHub. Open Source signing is currently a limited offering, and we plan to automate the onboarding process and open it up.</p>
<p>Another focus area will be software supply chains. Standards around software bills of material and provenance documentation are currently evolving, and we’re going to see them adopted in a way that allows interoperability across the supply chain.</p>
<p>Producing and signing these documents will be a priority for SignPath. Eventually, we’ll also want to verify this data for incoming components and allow our customers to build signing policies based on it. This will be a major milestone in our mission to provide verifiable trust in the software supply chain.</p>]]></content><author><name>Stefan Wenig</name></author><category term="blog" /><summary type="html"><![CDATA[Read the original interview at Cybernews.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2022-03-21_01-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2022-03-21_01-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">DP API Encryption Ineffective in Windows Containers</title><link href="https://about.fqa.test.signpath.io/blog/2021/03/23/dp-api-encryption-ineffective-in-windows-containers" rel="alternate" type="text/html" title="DP API Encryption Ineffective in Windows Containers" /><published>2021-03-23T08:00:01+00:00</published><updated>2021-03-23T08:00:01+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2021/03/23/dp-api-encryption-ineffective-in-windows-containers</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2021/03/23/dp-api-encryption-ineffective-in-windows-containers"><![CDATA[<p>We recently discovered a vulnerability in the key management of Windows containers. Windows containers used publicly available cryptographic keys when encrypting with the Windows Data Protection API (DP API). Furthermore, keys used in different containers by different organizations were the same. This vulnerability allowed attackers to decrypt any data that was encrypted with DP API keys in Windows containers. The vulnerability was confirmed by Microsoft and assigned <a href="https://msrc.microsoft.com/update-guide/vulnerability/CVE-2021-1645">CVE-2021-1645</a>.</p>
<p>In this blog post we describe CVE-2021-1645. This vulnerability was <a href="https://msrc.microsoft.com/update-guide/vulnerability/CVE-2021-1645#acknowledgements">discovered in close cooperation</a> with Marc Nimmerrichter from <a href="https://certitude.consulting/en">Certitude Consulting GmbH</a>. To understand the vulnerability, one has to have a basic understanding of DP API.</p>
<h2 id="dp-api">DP API</h2>
<p>The DP API allows applications to encrypt arbitrary data. An application does not have to manage keys. Instead, any data can be passed to the API, which then returns an encrypted blob. Similarly, an application can pass a previously encrypted blob to retrieve the plain text. The key used for these encryption operations is either tied to the user context or is unique to a machine (please refer to <a href="#1">[1]</a> or <a href="#2">[2]</a> for more details).</p>
<h2 id="cve-2021-1645-and-its-impact">CVE-2021-1645 and its Impact</h2>
<p>CVE-2021-1645 applies to both user and machine key DP API encryption within Windows Docker containers. In our explanation and PoC, we will use machine key encryption, but the same issue exists if data is encrypted with the user key.</p>
<p>Normally, a machine key is tied to a (virtual) machine. Therefore, if an application on machine A encrypted data, it would not be possible to decrypt the data on machine B. When designing the Windows container feature, Microsoft did not sufficiently consider this security behavior. Upon researching DP API in containers, we discovered that DP API machine keys were identical for all Windows containers with the same base image. This was because the DP API machine keys of containers came from the base image. As the base images are public, the DP API machine keys were public too! Therefore, any DP API encryption using the machine key in any Windows container was worthless.</p>
<p>Organizations that used DP API keys in Windows Docker containers to store encrypted data in a potentially insecure location should consider this data to be compromised.</p>
<h2 id="proof-of-concept">Proof of Concept</h2>
<p>In this section, we demonstrate that any data encrypted with the DP API machine key by a container application can be decrypted in any other container with the same base image. The test setup utilizes two virtual machines (VM1, VM2) generated from the Azure VM template “Windows Server 2019 Datacenter with Containers- Gen2”. Microsoft has already patched the base images in its Docker Hub repository. To reproduce this scenario, old image versions are required.</p>
<p>First, start a docker container called <em>Alice</em> on <em>VM1</em>:</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">docker</span><span class="w"> </span><span class="nx">run</span><span class="w"> </span><span class="nt">--name</span><span class="w"> </span><span class="nx">Alice</span><span class="w"> </span><span class="nt">-it</span><span class="w"> </span><span class="nx">mcr.microsoft.com/dotnet/framework/runtime:4.8-windowsservercore-ltsc2019</span><span class="w"> </span><span class="nx">cmd.exe</span><span class="w">
</span></code></pre></div></div>
<p>Then, encrypt a file in the <em>Alice</em> container using the powershell script vault.ps1 <a href="#3">[3]</a>:</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">C:</span><span class="err">></span><span class="nx">powershell.exe</span><span class="w"> </span><span class="nt">-File</span><span class="w"> </span><span class="nx">vault.ps1</span><span class="w"> </span><span class="nt">-StoreSecret</span><span class="w"> </span><span class="s2">"This is my secret text"</span><span class="w"> </span><span class="nx">secret.txt</span><span class="w">
</span><span class="n">C:</span><span class="err">></span><span class="nx">type</span><span class="w"> </span><span class="nx">secret.txt</span><span class="w">
</span><span class="n">AQAAANCMnd8BFdERjHoAwE/Cl</span><span class="o">+</span><span class="nx">sBAAAAm</span><span class="o">+</span><span class="nx">1a2TNbiEahEIB4y/C3vQAAAAACAAAAAAAQZgAAAAEAACAAAAAdbJ9ZanY929j39ZLgabsaE5hRS4TLkCaaaRqb</span><span class="w">
</span><span class="o">+</span><span class="n">n3ZXAAAAAAOgAAAAAIAACAAAAC7fHbsKHCTaMhsWIVMYwUZezbLozItcqExHdg9EJcfDiAAAABFv2EHA5TTqb8I9I</span><span class="o">+</span><span class="nx">BZrfQS5ViD93KZlL4FoYIBldGY0AA</span><span class="w">
</span><span class="n">AABdx7adlANRnw1shJTOtE6cYTAeqmb1yTe9adcSY1nBvtqlqSWQ/zwGaqfIfumuUm</span><span class="o">+</span><span class="nx">o</span><span class="o">+</span><span class="nx">ySwZXH/Su5GovJ8aUP9</span><span class="w">
</span></code></pre></div></div>
<p>Start a docker container <em>Bob</em> on <em>VM2</em>:</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">docker</span><span class="w"> </span><span class="nx">run</span><span class="w"> </span><span class="nt">--name</span><span class="w"> </span><span class="nx">Bob</span><span class="w"> </span><span class="nt">-it</span><span class="w"> </span><span class="nx">mcr.microsoft.com/dotnet/framework/runtime:4.8-windowsservercore-ltsc2019</span><span class="w"> </span><span class="nx">cmd.exe</span><span class="w">
</span></code></pre></div></div>
<p>The following command shows that the file can be decrypted in the <em>Bob</em> container:</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="n">C:</span><span class="err">></span><span class="nx">powershell.exe</span><span class="w"> </span><span class="nt">-File</span><span class="w"> </span><span class="nx">vault.ps1</span><span class="w"> </span><span class="nx">secret.txt</span><span class="w">
</span><span class="n">This</span><span class="w"> </span><span class="nx">is</span><span class="w"> </span><span class="nx">my</span><span class="w"> </span><span class="nx">secret</span><span class="w"> </span><span class="nx">text</span><span class="w">
</span></code></pre></div></div>
<h2 id="security-patch">Security Patch</h2>
<p>Microsoft fixed CVE-2021-1645 with the Microsoft Patch Tuesday of January 2021. Affected users should apply both OS updates and base-image updates to address this issue.</p>
<p>Unfortunately, the patch comes with a caveat: the vulnerability appears to be due to a design problem, so it could not be fixed in a straightforward way. After the patch, Windows containers generate DP API keys when the container is first started. This means that all containers use different keys. There is currently no way to share keys between containers or to transfer a key from one container to another. This is impractical, as containers are often relatively short-lived. Moreover, when a container is scaled up, new containers will not be able to work with previously encrypted blobs.</p>
<p>As a result, the DP API is currently of limited use in containers.</p>
<div class="sources">
<h1 id="sources">Sources</h1>
<ul>
<li>[<span id="1">1</span>] <a href="https://www.passcape.com/index.php?id=28&section=docsys&cmd=details">https://www.passcape.com/index.php?id=28&section=docsys&cmd=details</a></li>
<li>[<span id="2">2</span>] <a href="https://docs.microsoft.com/en-us/previous-versions/ms995355(v=msdn.10)?redirectedfrom=MSDN">https://docs.microsoft.com/en-us/previous-versions/ms995355(v=msdn.10)?redirectedfrom=MSDN</a></li>
<li>[<span id="3">3</span>] <a href="https://blag.nullteilerfrei.de/2018/01/05/powershell-dpapi-script/">https://blag.nullteilerfrei.de/2018/01/05/powershell-dpapi-script/</a></li>
</ul>
</div>
<h1 id="appendix">Appendix</h1>
<p><code class="language-plaintext highlighter-rouge">Vault.ps1</code></p>
<p>Script from <a href="#3">[3]</a></p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kr">Param</span><span class="p">(</span><span class="w">
</span><span class="p">[</span><span class="n">string</span><span class="p">]</span><span class="w"> </span><span class="nv">$StoreSecret</span><span class="p">,</span><span class="w">
</span><span class="p">[</span><span class="n">Parameter</span><span class="p">(</span><span class="n">Mandatory</span><span class="o">=</span><span class="nv">$True</span><span class="p">,</span><span class="n">Position</span><span class="o">=</span><span class="mi">0</span><span class="p">)]</span><span class="w">
</span><span class="p">[</span><span class="n">string</span><span class="p">]</span><span class="w"> </span><span class="nv">$filename</span><span class="w"> </span><span class="p">)</span><span class="w">
</span><span class="p">[</span><span class="n">void</span><span class="p">]</span><span class="w"> </span><span class="p">[</span><span class="n">Reflection.Assembly</span><span class="p">]::</span><span class="n">LoadWithPartialName</span><span class="p">(</span><span class="s2">"System.Security"</span><span class="p">)</span><span class="w">
</span><span class="nv">$scope</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">[</span><span class="n">System.Security.Cryptography.DataProtectionScope</span><span class="p">]::</span><span class="n">CurrentUser</span><span class="w">
</span><span class="nx">if</span><span class="w"> </span><span class="p">(</span><span class="nv">$StoreSecret</span><span class="w"> </span><span class="o">-eq</span><span class="w"> </span><span class="s2">""</span><span class="p">)</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$data</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="n">Get-Content</span><span class="w"> </span><span class="nv">$filename</span><span class="w">
</span><span class="nv">$ciphertext</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">[</span><span class="n">System.Convert</span><span class="p">]::</span><span class="n">FromBase64String</span><span class="p">(</span><span class="nv">$data</span><span class="p">)</span><span class="w">
</span><span class="nv">$plaintext</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">[</span><span class="n">System.Security.Cryptography.ProtectedData</span><span class="p">]::</span><span class="n">Unprotect</span><span class="p">(</span><span class="w">
</span><span class="nv">$ciphertext</span><span class="p">,</span><span class="w"> </span><span class="bp">$null</span><span class="p">,</span><span class="w"> </span><span class="nv">$scope</span><span class="w"> </span><span class="p">)</span><span class="w">
</span><span class="p">[</span><span class="n">System.Text.UTF8Encoding</span><span class="p">]::</span><span class="n">UTF8.GetString</span><span class="p">(</span><span class="nv">$plaintext</span><span class="p">)</span><span class="w">
</span><span class="p">}</span><span class="w"> </span><span class="kr">else</span><span class="w"> </span><span class="p">{</span><span class="w">
</span><span class="nv">$plaintext</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">[</span><span class="n">System.Text.UTF8Encoding</span><span class="p">]::</span><span class="n">UTF8.GetBytes</span><span class="p">(</span><span class="nv">$StoreSecret</span><span class="p">)</span><span class="w">
</span><span class="nv">$ciphertext</span><span class="w"> </span><span class="o">=</span><span class="w"> </span><span class="p">[</span><span class="n">System.Security.Cryptography.ProtectedData</span><span class="p">]::</span><span class="n">Protect</span><span class="p">(</span><span class="w">
</span><span class="nv">$plaintext</span><span class="p">,</span><span class="w"> </span><span class="bp">$null</span><span class="p">,</span><span class="w"> </span><span class="nv">$scope</span><span class="w"> </span><span class="p">)</span><span class="w">
</span><span class="p">[</span><span class="n">System.Convert</span><span class="p">]::</span><span class="n">ToBase64String</span><span class="p">(</span><span class="nv">$ciphertext</span><span class="p">)</span><span class="w"> </span><span class="err">></span><span class="w"> </span><span class="nv">$filename</span><span class="w">
</span><span class="p">}</span><span class="w">
</span></code></pre></div></div>]]></content><author><name>Marc Nimmerrichter</name></author><category term="blog" /><summary type="html"><![CDATA[We recently discovered a vulnerability in the key management of Windows containers. Windows containers used publicly available cryptographic keys when encrypting with the Windows Data Protection API (DP API). Furthermore, keys used in different containers by different organizations were the same. This vulnerability allowed attackers to decrypt any data that was encrypted with DP API keys in Windows containers. The vulnerability was confirmed by Microsoft and assigned CVE-2021-1645.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2021-03-23_02-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2021-03-23_02-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Experiences with Security Report Handling: The Good and the Bad</title><link href="https://about.fqa.test.signpath.io/blog/2021/03/23/experiences-with-security-report-handling" rel="alternate" type="text/html" title="Experiences with Security Report Handling: The Good and the Bad" /><published>2021-03-23T08:00:00+00:00</published><updated>2021-03-23T08:00:00+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2021/03/23/experiences-with-security-report-handling</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2021/03/23/experiences-with-security-report-handling"><![CDATA[<p>When reporting the vulnerability/security issue described in <a href="https://about.signpath.io/blog/2020/08/26/on-the-importance-of-trust-validation.html">On the Importance of Trust Validation: Microsoft’s Dangerous Mistake</a> and <a href="https://about.signpath.io/blog/2020/08/26/unfulfilled-expectations.html">Unfulfilled Expectations: Revoked Certificates in JAR Signing</a> we noticed big differences in security report handling between the contacted vendors. In this blog post we talk about our experiences with these security reports. The blog post is written in cooperation with Marc Nimmerrichter from <a href="https://certitude.consulting/en">Certitude</a>.</p>
<h2 id="reporting-the-vsix-installer-vulnerability">Reporting the VSIX Installer Vulnerability</h2>
<p>We reported the vulnerability to Microsoft shortly after its discovery. Microsoft is not only the developer of the VSIX installer, but also the responsible CVE Numbering Authority (CNA) for Microsoft products <a href="#1">[1]</a>, i.e. Microsoft is responsible for issuing CVEs for the VSIX installer. Microsoft has confirmed the vulnerability and has rated its severity as <strong>moderate</strong>. Microsoft provided the following explanation for the severity of the vulnerability:</p>
<ol>
<li>Revocation checking does not completely mitigate a stolen private key scenario anyway.</li>
<li>No popular publisher of VSIX packages has had their private key stolen.</li>
</ol>
<p>Microsoft stated this vulnerability <strong>does not warrant the issuing of a CVE</strong> because it only has <strong>moderate severity</strong>.</p>
<p>While we agree with the first argument, the second remains a mystery:</p>
<ul>
<li>It’s unclear to us how Microsoft would know about all stolen private keys, as it is not obligatory to report stolen keys to Microsoft</li>
<li>There are no specific code-signing certificates for VSIX, so this vulnerability could be exploited with private keys stolen from any software publisher, not only from “popular package publishers” (see <a href="https://about.signpath.io/blog/2020/08/26/on-the-importance-of-trust-validation.html">On the Importance of Trust Validation: Microsoft’s Dangerous Mistake</a>)</li>
</ul>
<p>Furthermore, there is no reason to assume that CVEs should only be issued for vulnerabilities rated higher than moderate. If that were the case, users would be restricted in their ability to protect themselves from moderate vulnerabilities. It should be noted that CVEs are frequently issued for vulnerabilities with moderate severity (e.g. <a href="https://nvd.nist.gov/vuln/detail/CVE-2019-1020019">CVE-2019-1020019</a>, <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-12880">CVE-2020-12880</a>, or <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-10643">CVE-2020-10643</a>).</p>
<p>Unfortunately, Microsoft’s decision on issuing a CVE was not up for discussion, so we contacted MITRE to obtain a CVE. MITRE is a root CNA and is responsible for all CVEs that are not covered by other CNAs <a href="#1">[1]</a>. After a short exchange of emails, MITRE stopped responding to us, even after another follow-up on our behalf. Almost a year later, in late April 2020, we unexpectedly received a message from MITRE, asking us if we would still like to have a CVE number assigned. Despite our confirmation that we would still like a CVE assigned, MITRE has not assigned this vulnerability a CVE number yet.</p>
<h3 id="intransparent-fix">Intransparent Fix</h3>
<p>Microsoft informed us that they were planning to fix this vulnerability with Visual Studio 16.3, which was released in Fall 2019. The release notes of Visual Studio 16.3 did not mention the vulnerability in any way <a href="#2">[2]</a>. However, in May 2020 we could confirm that the vulnerability is fixed at least in versions >=16.5.2047.</p>
<h3 id="timeline">Timeline</h3>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>15/06/2019: Initial discovery of the vulnerability and subsequent confirmation
08/07/2019: Reporting the vulnerability to Microsoft
21/07/2019: Microsoft confirms the vulnerability
08/08/2019: Microsoft plans to address the vulnerability in Visual Studio 16.3
14/08/2019: Trying to obtain a CVE from MITRE since Microsoft is very reluctant to issue a CVE
20/08/2019: Microsoft refuses to issue a CVE
20/08/2019: No more responses from MITRE until 28/04/2020
23/09/2019: Release of Visual Studio 16.3 (supposedly fixed version)
28/04/2020: MITRE asks if we still want a CVE
11/05/2020: Confirming to MITRE that we still want a CVE
11/05/2020 - 23/03/2021: After further requests to MITRE no CVE was assigned
</code></pre></div></div>
<h2 id="reporting-the-jarsigner-security-issue">Reporting the JarSigner Security Issue</h2>
<p>Shortly after discovery of the missing revocation check in the JarSigner, we informed Oracle about it. Oracle quickly agreed that this could be an issue and informed us that they would address it by providing an option for CRL checks in the JarSigner. As the security issue is conceptual and not technically a vulnerability, we did not try to obtain a CVE for it.</p>
<h3 id="timeline-1">Timeline</h3>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>24/03/2020: Initial discovery of the security issue and subsequent confirmation
26/03/2020: Reporting the security issue to Oracle
20/04/2020: Oracle confirms the security issue and plans to address it in JDK15
</code></pre></div></div>
<h2 id="security-report-handling-microsoft-vs-oracle">Security Report Handling: Microsoft vs. Oracle</h2>
<p>The security report handling between Microsoft and Oracle could hardly be more different.</p>
<p>When we discussed the vulnerability with Microsoft, they initially tried to disregard it - seemingly without fully understanding it at first. After a more thorough explanation, Microsoft agreed to issue a fix for this vulnerability, but, in our opinion, downplayed its severity with at least partially invalid arguments (see above). Based on these arguments, they argued moderate severity and that no CVE was warranted. The unwillingness to assign a CVE and the absence of any mention of this vulnerability in the Visual Studio 16.3 release notes cast some doubt on their priorities: would they rather keep a vulnerability under cover and produce a silent fix after a long delay, or would they keep their users informed to help them stay secure? The former approach is commonly considered bad practice.</p>
<p>In contrast to Microsoft’s security report handling, Oracle acknowledged the security issue, without trying to avoid taking action or unreasonably downplaying it, and is transparently fixing it through a public ticket system. Although we wish Oracle had made automatic CRL checks the default in the JarSigner, in our opinion Oracle handled this security report very well overall.</p>
<div class="sources">
<h1 id="sources">Sources</h1>
<ul>
<li>[<span id="1">1</span>] <a href="https://cve.mitre.org/cve/request_id.html">https://cve.mitre.org/cve/request_id.html</a></li>
<li>[<span id="2">2</span>] <a href="https://docs.microsoft.com/en-us/visualstudio/releases/2019/release-notes-v16.3">https://docs.microsoft.com/en-us/visualstudio/releases/2019/release-notes-v16.3</a></li>
</ul>
</div>]]></content><author><name>Daniel Ostovary</name></author><category term="blog" /><summary type="html"><![CDATA[When reporting the vulnerability/security issue described in On the Importance of Trust Validation: Microsoft’s Dangerous Mistake and Unfulfilled Expectations: Revoked Certificates in JAR Signing we noticed big differences in security report handling between the contacted vendors. In this blog post we talk about our experiences with these security reports. The blog post is written in cooperation with Marc Nimmerrichter from Certitude.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2021-03-23_01-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2021-03-23_01-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Evaluating the Sunburst Hack: Causes and Future Prevention</title><link href="https://about.fqa.test.signpath.io/blog/2020/12/21/evaluating-sunburst" rel="alternate" type="text/html" title="Evaluating the Sunburst Hack: Causes and Future Prevention" /><published>2020-12-21T08:00:00+00:00</published><updated>2020-12-21T08:00:00+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2020/12/21/evaluating-sunburst</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2020/12/21/evaluating-sunburst"><![CDATA[<p>Sunburst/Solorigate is already the most discussed hacker attack in living memory, maybe back to when Iran’s nuclear program was pushed back years by Stuxnet. But what do we really know?</p>
<p>Several U.S. agencies were infiltrated using versions of SolarWinds’ Orion software that carried a backdoor payload. Given the scope of this software, it’s not surprising that a successful attack on Orion would also pave the way for hacking customers. So rather than examining the exact nature of these backdoors and how they were exploited, let’s look at how the software was hacked in the first place.</p>
<p>Tomislav Peričin of ReversingLabs did an <a href="https://blog.reversinglabs.com/blog/sunburst-the-next-level-of-stealth">exhaustive analysis</a> of this hack. By decompiling several versions of Orion, he showed how the hackers inched their way from a careful proof of concept to a full-blown backdoor through incremental source code modifications.</p>
<h2 id="two-possible-attack-vectors">Two possible attack vectors</h2>
<p>What’s not so clear though is whether these changes were committed to the source code repository, where every developer could see them, or injected on the build system.</p>
<blockquote>
<p><strong>Update:</strong> This article is based on the research by ReversingLabs. SolarWinds has published a <a href="https://sec.report/Document/0001628280-20-017451/">statement</a> that says that the code “was introduced as a result of a compromise of the Orion software build system and <strong>was not present in the source code repository</strong> of the Orion products.”</p>
<p>So it wasn’t attack vector 1. But considering the careful way the code was inserted incrementally, it’s still a vector every development organization has to watch out for.</p>
</blockquote>
<p>Looking forward, this does not really matter. Both ways are possible attack vectors, so we must consider both.</p>
<h2 id="attack-vector-1-modifying-the-source-code-in-the-repository">Attack vector 1: modifying the source code in the repository</h2>
<p>Now here’s a no-brainer: If an attacker manages to modify the source code without getting caught, the modifications will eventually turn up at customer sites.</p>
<p>So what can be done to avoid this?</p>
<p>In short: create and enforce a task-based review policy.</p>
<p>The code modifications were carefully placed where they would not cause suspicion. For instance, the code responsible for starting the backdoor thread was placed where legitimate background processing was done too. The string obfuscation should alarm careful readers, but that was done in another module, where nobody would have much business anyway.</p>
<p>Those were clever measures for sure, but they would not go undetected in a task-based review. Assume that the team has the following security measures:</p>
<table>
<thead>
<tr>
<th>Policy</th>
<th>Implementation</th>
</tr>
</thead>
<tbody>
<tr>
<td><strong>All source code commits must reference developer tasks</strong>, such as issues or stories</td>
<td>This can be implemented in most version control systems. E.g. Git provides <a href="https://git-scm.com/book/en/v2/Customizing-Git-An-Example-Git-Enforced-Policy">server-side</a> <a href="https://stackoverflow.com/questions/14151775/how-do-i-set-a-pattern-for-git-commit-messages">hooks</a>. On top of this, some Git servers provide configurations such as <a href="https://docs.gitlab.com/ee/push_rules/push_rules.html#commit-messages-with-a-specific-reference">regex rules</a> for commit messages. (A minimal hook sketch follows the table.)</td>
</tr>
<tr>
<td>The source code system is configured to <strong>accept merges on release-branches only after successful code reviews</strong></td>
<td>Again, you can use basic hooks, or features such as <a href="https://docs.github.com/en/github/administering-a-repository/managing-a-branch-protection-rule">branch</a> <a href="https://docs.gitlab.com/ee/user/project/protected_branches.html">protection</a>. Or use <a href="https://en.wikipedia.org/wiki/List_of_tools_for_code_review">dedicated review software</a>.</td>
</tr>
<tr>
<td>The code signing process only signs builds from release branches</td>
<td>If you do code signing from your CI system, make sure it is only configured for reviewed branches. <strong>SignPath can verify the source of a build and enforce your branch policies directly</strong>, so they cannot be sidestepped by careless misconfigurations or hackers with CI access.</td>
</tr>
</tbody>
</table>
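<p>For illustration, the commit-message rule from the first table row could be enforced by hook logic along these lines (a hypothetical PowerShell sketch; the issue-key pattern and the way the hook is wired up depend on your Git server):</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Hypothetical commit-msg hook body: git passes the path of the message file.
param([string] $MessageFile)

$message = Get-Content $MessageFile -Raw
# Require a reference to a tracked task, e.g. "PROJ-123" (pattern is an example).
if ($message -notmatch 'PROJ-\d+') {
    Write-Error 'Commit rejected: message must reference an issue (e.g. PROJ-123).'
    exit 1
}
exit 0
</code></pre></div></div>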
<p>It’s easy to see how these hacks could avoid accidental detection. But to work around a carefully implemented review policy, the hackers would have to hack the issue tracking system too (quite possible), create plausible tasks for a new background-processing thread (hm, what for?) and for string obfuscation - now that should trigger some alarms!</p>
<p>Also, make sure that your review guidelines contain clear rules for triggering security reviews. A very fine example can be found <a href="https://docs.gitlab.com/ee/development/code_review.html">here</a>.</p>
<p>Thorough source code reviews require some effort. But you don’t do it just for security, you do it for quality. Like most reasonable quality measures, they will pay off with reduced cost for support, troubleshooting and bug fixing. In 2020, there’s hardly any excuse left for not doing them, so why not go all the way and enforce them?</p>
<h2 id="attack-vector-2-modifying-the-source-code-at-build-time">Attack vector 2: modifying the source code at build time</h2>
<p>Now making sure that your source code repository contains only what it should is an obvious step. But what if the hackers didn’t commit these changes after all? <strong>What if the build infrastructure was compromised</strong> to simply <strong>replace a few files</strong> before compiling the software? <strong>What if it applied post-compilation transformations</strong> to include the payload?</p>
<p>These are routine activities for hackers, the tools are all there too. So if they can get into the build process, or modify the code before signing and releasing, that’s rather easy to achieve.</p>
<p>This attack vector is much harder to close. It requires a <strong>detailed analysis</strong> of the entire <strong>build infrastructure</strong> <em>and</em> <strong>every single project’s build and code signing process</strong>.</p>
<p>Having done this for some years for a living, we can now safely say one thing: <strong>this is hard.</strong></p>
<p>Analyzing entire build pipelines for possible attacks from within the network is rarely done, and it’s a daunting task. There is little know-how around, and still too little vendor support: CI systems are built for speed, scalability and flexibility first.</p>
<p>And there’s a reason for this too: CI systems are used for so many things today, some of them rather resource-intensive too. Think about executing Web test suites, for instance. Or DAST tests. They require a lot of analysis and troubleshooting, so there’s a big incentive for handing out access permissions for components of the CI system too.</p>
<p>To be sure, it’s easy to come to the conclusion that a process is safe. There is this token or password that you need for code signing, and we’re only using it in this specific way, right? Probably not: relying on a single secret and its proper handling will almost always leave some room for attacks. But who has the time and budget to do a full security audit for every project?</p>
<p><strong>At SignPath, this was our mission from the start:</strong> Storing keys on HSMs is not the solution for all code signing risks. Neither is a code signing gateway that simply provides HSM access using some finer-grained access control and auditing.</p>
<p>Our customers need a simple way to make sure that <strong>every signed and published release</strong> of their software</p>
<ul>
<li>can be <strong>traced back to a specific source code version</strong>, without room for manipulation</li>
<li><strong>meets all policy requirements</strong>, including reviews and testing</li>
<li>was built on <strong>secure infrastructure</strong>, without direct or indirect developer access</li>
</ul>
<p>To find out more about code signing with SignPath, please <a href="mailto:sales@signpath.io">contact us</a>.</p>
<h3 id="updates">Updates</h3>
<ul>
<li>2020-12-21 8:00 UTC: Added SolarWinds’ statement that the modified code was <em>not</em> in their repository.</li>
</ul>]]></content><author><name>Stefan Wenig</name></author><category term="blog" /><summary type="html"><![CDATA[Sunburst/Solorigate is already the most discussed hacker attack in living memory, maybe back to when Iran’s nuclear program was pushed back years by Stuxnet. But what do we really know?]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2020-12-21-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2020-12-21-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">On the Importance of Trust Validation: Microsoft’s Dangerous Mistake</title><link href="https://about.fqa.test.signpath.io/blog/2020/08/26/on-the-importance-of-trust-validation" rel="alternate" type="text/html" title="On the Importance of Trust Validation: Microsoft’s Dangerous Mistake" /><published>2020-08-26T00:00:00+00:00</published><updated>2020-08-26T00:00:00+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2020/08/26/on-the-importance-of-trust-validation</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2020/08/26/on-the-importance-of-trust-validation"><![CDATA[<p>Last year we discovered a vulnerability in the Visual Studio Extension (VSIX) installer, which comes with Microsoft Visual Studio. When verifying the signature of a VSIX package, the VSIX installer failed to check the trust of the timestamp under certain circumstances. This vulnerability allowed an attacker to apply a non-trustworthy timestamp without users being warned about it.</p>
<p>In this blog post we talk about this vulnerability and its impact. The blog post is written in cooperation with Marc Nimmerrichter from <a href="https://www.impidio.com/blog/on-the-importance-of-trust-validation-microsofts-dangerous-mistake">Impidio</a>. The vulnerability discussed here was reported to Microsoft shortly after its discovery. We will talk about our experiences with reporting this vulnerability in a later blog post. To understand the vulnerability, one has to understand timestamping in code signing first.</p>
<h1 id="timestamping">Timestamping</h1>
<p>Unlike signatures of TLS certificates, signatures of code-signing certificates may be verified a long time after their application (e.g. a software publisher may apply a signature on an .EXE file in 2018; a user may use this .EXE in 2024). Code-signing certificates have a relatively short validity period of 1-3 years <a href="#1">[1]</a>. As a result, upon execution of a program, a user may be presented with a certificate expiration warning because the code-signing certificate is no longer valid at the time the signed program is executed <a href="#2">[2]</a>. Simply giving a code-signing certificate a long validity is considered bad practice. Instead, in code-signing, so-called timestamps are frequently used <a href="#1">[1]</a>. These timestamps are provided by trustworthy Timestamping Authorities (TSAs) and cryptographically guarantee that certain data existed at a certain time <a href="#2">[2]</a>. This means, when applying a timestamp to a signature, it is cryptographically guaranteed by the issuing TSA that the signature was applied at the time of the timestamp <a href="#2">[2]</a>.</p>
<p>With such a timestamp applied to a signature, <strong>the validity period of a code signature is extended</strong> to the validity period of the TSA’s certificate <a href="#4">[4]</a>. The validity period of a TSA’s certificate is usually relatively long (typically 10-15 years; e.g. see <a href="#5">[5-7]</a>).</p>
<p>Timestamping can also be used to <strong>keep signatures of a revoked code-signing certificate valid if and only if (iff) the revocation date is after the timestamp date</strong>, as suggested by RFC 3161 <a href="#8">[8]</a>.</p>
<p>It is important to check the correctness and trustworthiness of a timestamp <a href="#3">[3]</a>. Otherwise it may be possible to maliciously alter the validity of a code signature.</p>
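<p>As an illustration of how signing and timestamping typically look on Windows, here is a sketch using the built-in Authenticode cmdlets (file name, certificate lookup and TSA URL are examples):</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Sign a file and let a TSA countersign the signature with a timestamp.
$cert = Get-ChildItem Cert:\CurrentUser\My -CodeSigningCert | Select-Object -First 1
Set-AuthenticodeSignature -FilePath .\MyTool.exe -Certificate $cert -TimestampServer 'http://timestamp.digicert.com'

# Inspect the result; the TSA certificate is what a verifier must trust-check.
$sig = Get-AuthenticodeSignature -FilePath .\MyTool.exe
$sig.Status                   # e.g. Valid
$sig.TimeStamperCertificate   # certificate of the TSA that issued the timestamp
</code></pre></div></div>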
<h1 id="the-vulnerability">The Vulnerability</h1>
<p>In the case of the VSIX installer, an attacker was able to apply a valid code signature to a file using an expired code-signing certificate and a self-created non-trustworthy TSA. For that, the attacker had to apply a code signature using the expired code-signing certificate, backdate the signature to a time when the code-signing certificate was valid, and apply a specially crafted timestamp. This timestamp would have to:</p>
<ol>
<li>contain the certificate chain of the self-created non-trustworthy TSA</li>
<li>be dated before the expiration of the code-signing certificate</li>
</ol>
<p>Before the vulnerability was fixed, the VSIX installer accepted such a combination of a signature and timestamp, i.e. it did not report a non-trustworthy signature/timestamp. This behavior shows that the VSIX installer did not correctly check the trustworthiness of a TSA.</p>
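<p>A correct verifier must validate the TSA’s certificate chain against trusted roots before honoring a timestamp. A minimal sketch of that missing check, using .NET from PowerShell (<code>$tsaCert</code> stands for the TSA certificate extracted from the signature; this is an illustration, not the installer’s actual code):</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Build and validate the chain of the TSA certificate, including revocation.
$chain = New-Object System.Security.Cryptography.X509Certificates.X509Chain
$chain.ChainPolicy.RevocationMode = [System.Security.Cryptography.X509Certificates.X509RevocationMode]::Online
if (-not $chain.Build($tsaCert)) {
    $chain.ChainStatus | ForEach-Object { $_.StatusInformation }   # e.g. UntrustedRoot
    throw 'Timestamp rejected: the TSA certificate does not chain to a trusted root.'
}
</code></pre></div></div>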
<h1 id="impact">Impact</h1>
<p>We discovered three potential attack scenarios for this vulnerability, one of which is infeasible due to the way the VSIX installer supports timestamping.</p>
<p><strong>Scenario 1 (‘reviving’ expired code-signing certificates):</strong> The vulnerability allows an attacker to craft malicious VSIX packages with ‘revived’ expired code-signing certificates, i.e. <strong>use expired code-signing certificates to sign VSIX packages</strong>. This is problematic because owners of certificates may not protect expired certificates anymore or may not revoke expired code-signing certificates in case of theft because these certificates should be unusable anyway. Attackers can abuse such certificate ‘revival’ to sign malicious code, which would be accepted by the VSIX installer without any warning and subsequently executed by unsuspecting end users. Essentially, in this scenario an attacker would maliciously <strong>extend the validity period of a code-signing certificate</strong>. There are no specific code-signing certificates for VSIX packages. So any code-signing certificate can be used to sign VSIX packages and consequently any expired code-signing certificate could be used for this attack.</p>
<p>Imagine the following example: On February 20th 2020 an attacker steals a code-signing certificate that expired on January 10th from a software publisher. The software publisher has not detected the theft, or has detected the theft but decided not to revoke the certificate because the theft happened after the certificate expired. The attacker now wants to ‘revive’ the certificate. For that, the attacker sets the system time back to January 9th 2020 and signs the VSIX package with the expired certificate. Then the attacker timestamps the signature for January 9th 2020 with their self-created TSA, using a timestamp that contains the timestamping certificate. After that, the attacker distributes the VSIX package to end users (e.g. by publishing it on a website, by performing Man-in-the-Middle (MitM) attacks on a file download, or by putting it in a shared directory). When an end user executes the VSIX package, the signature and timestamp are shown as valid and the VSIX package is subsequently installed without any warning.</p>
<p><strong>Scenario 2 (applying short-lived timestamps):</strong> The vulnerability further allows an attacker that can intercept and modify HTTP requests/responses (i.e. the attacker is a MitM) <strong>to respond to a software publisher’s legitimate timestamping request with a timestamp with a very short lifespan</strong>, without the software publisher being warned. As soon as the software publisher’s code-signing certificate expires, the signature would become invalid unexpectedly. This could heavily diminish the usage of the victim’s VSIX package, depending on the number of new installations. Timestamp requests are often sent via HTTP (see <a href="#9">[9]</a>), so Man-in-the-Middle attacks are not unlikely.</p>
<p>Imagine the following example: An attacker is a MitM in the communication between a software publisher and their TSA. The attacker has set up their own non-trustworthy TSA. Today is July 15th 2020. The code-signing certificate of the software publisher expires on July 20th 2020. The certificate of the TSA that the software publisher trusts (i.e. the timestamping certificate) expires on January 1st 2030. To extend the validity period of a signature, the software publisher timestamps their signature with the TSA they trust. The attacker intercepts this call and responds with a timestamp that contains the timestamping certificate of their own non-trustworthy TSA. After signing and timestamping their VSIX package, the software publisher tests whether the signing/timestamping worked properly by installing the VSIX package with the VSIX installer. The VSIX installer will show a valid signature and timestamp without any indication of the validity period of the signature or the timestamp. Subsequently, the software publisher starts publishing their software as usual on July 18th 2020. In the first days, several hundred users download the VSIX package. On July 21st 2020, the first users report that the signature on the VSIX package is invalid because the underlying certificate is expired. Now the software publisher has to quickly publish the VSIX package with a new code-signing certificate.</p>
<p><strong>Scenario 3 (dodged a bullet - ‘reviving’ revoked code-signing certificates):</strong> Our tests have shown that the VSIX installer only checks if a code-signing certificate was revoked, but not if it was timestamped before the revocation date. This means the VSIX installer rejects all revoked certificates regardless of the timestamp (i.e. it does <strong>NOT keep signatures of revoked code-signing certificates valid iff the revocation date is after the timestamp date</strong>). As a result of the VSIX installer’s behavior, this vulnerability did not allow ‘reviving’ revoked code-signing certificates. If the VSIX installer allowed revoked code-signing certificates to stay valid iff the revocation date is after the timestamp date, an attacker could have <strong>‘revived’ revoked code-signing certificates</strong> (i.e. made revoked code-signing certificates valid again). This would have worked similarly to the example of scenario 1. Instead of backdating the signature and timestamp to a date when the certificate was expired, an attacker would have backdated the signature and timestamp to a date when the certificate was not revoked. Needless to say, the possibility of ‘reviving’ revoked code-signing certificates would have constituted a critical vulnerability. Essentially, Microsoft unintentionally dodged a huge bullet by NOT <strong>keeping signatures of revoked code-signing certificates valid iff the revocation date is after the timestamp date</strong>.</p>
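<p>Expressed as logic, this is the RFC 3161-style rule a verifier would need (a sketch with illustrative variable names, not actual installer code):</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code># A signature from a revoked certificate stays valid only if it carries a
# trusted timestamp that predates the revocation date.
if ($certificateIsRevoked -and (-not $hasTrustedTimestamp -or $timestampDate -ge $revocationDate)) {
    throw 'Reject signature: certificate was revoked and no earlier trusted timestamp exists.'
}
# otherwise: accept (certificate not revoked, or timestamped before revocation)
</code></pre></div></div>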
<h1 id="security-patch">Security Patch</h1>
<p>Microsoft informed us that they were planning to fix the vulnerability with Visual Studio 16.3, which was released in Fall 2019. Old versions of Visual Studio will not be patched, and thus will remain vulnerable indefinitely. Unfortunately, the release notes of Visual Studio 16.3 did not mention the vulnerability described here in any way <a href="#10">[10]</a>. However, in May 2020 we could confirm that the vulnerability is fixed at least in versions >=16.5.2047.</p>
<div class="sources">
<h1 id="sources">Sources</h1>
<ul>
<li>[<span id="1">1</span>] <a href="https://knowledge.digicert.com/generalinformation/INFO1119.html">https://knowledge.digicert.com/generalinformation/INFO1119.html</a></li>
<li>[<span id="2">2</span>] <a href="https://casecurity.org/wp-content/uploads/2013/10/CASC-Code-Signing.pdf">https://casecurity.org/wp-content/uploads/2013/10/CASC-Code-Signing.pdf</a></li>
<li>[<span id="3">3</span>] <a href="https://csrc.nist.gov/CSRC/media/Publications/white-paper/2018/01/26/security-considerations-for-code-signing/final/documents/security-considerations-for-code-signing.pdf">https://csrc.nist.gov/CSRC/media/Publications/white-paper/2018/01/26/security-considerations-for-code-signing/final/documents/security-considerations-for-code-signing.pdf</a></li>
<li>[<span id="4">4</span>] <a href="https://www.xolphin.com/support/signatures/Frequently_asked_questions/Timestamping">https://www.xolphin.com/support/signatures/Frequently_asked_questions/Timestamping</a></li>
<li>[<span id="5">5</span>] <a href="https://support.sectigo.com/ES_KnowledgeDetailPageFaq?Id=kA01N000000btid">https://support.sectigo.com/ES_KnowledgeDetailPageFaq?Id=kA01N000000btid</a></li>
<li>[<span id="6">6</span>] <a href="https://www.sede.fnmt.gob.es/documents/10445900/10577712/dpstq_english.pdf">https://www.sede.fnmt.gob.es/documents/10445900/10577712/dpstq_english.pdf</a></li>
<li>[<span id="7">7</span>] <a href="https://www.digicert-grid.com/DigiCert_CP_v403.pdf">https://www.digicert-grid.com/DigiCert_CP_v403.pdf</a></li>
<li>[<span id="8">8</span>] <a href="https://www.ietf.org/rfc/rfc3161.txt">https://www.ietf.org/rfc/rfc3161.txt</a></li>
<li>[<span id="9">9</span>] <a href="https://gist.github.com/Manouchehri/fd754e402d98430243455713efada710">https://gist.github.com/Manouchehri/fd754e402d98430243455713efada710</a></li>
<li>[<span id="10">10</span>] <a href="https://docs.microsoft.com/en-us/visualstudio/releases/2019/release-notes-v16.3">https://docs.microsoft.com/en-us/visualstudio/releases/2019/release-notes-v16.3</a></li>
</ul>
</div>]]></content><author><name>Daniel Ostovary</name></author><category term="blog" /><summary type="html"><![CDATA[Last year we discovered a vulnerability in the Visual Studio Extension (VSIX) installer, which comes with Microsoft Visual Studio. When verifying the signature of a VSIX package, the VSIX installer failed to check the trust of the timestamp under certain circumstances. This vulnerability allowed an attacker to apply a non-trustworthy timestamp without users being warned about it.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2020-08-26-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2020-08-26-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Unfulfilled Expectations: Revoked Certificates in JAR Signing</title><link href="https://about.fqa.test.signpath.io/blog/2020/08/26/unfulfilled-expectations" rel="alternate" type="text/html" title="Unfulfilled Expectations: Revoked Certificates in JAR Signing" /><published>2020-08-26T00:00:00+00:00</published><updated>2020-08-26T00:00:00+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2020/08/26/unfulfilled-expectations</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2020/08/26/unfulfilled-expectations"><![CDATA[<p>In April 2020 we became aware of a conceptual security issue in the Java JarSigner. The JarSigner does not check certificate revocations, which breaks JAR signing to some extent.</p>
<p>In this blog post we are going to talk about this issue. The blog post is written in cooperation with Marc Nimmerrichter from <a href="https://www.impidio.com/blog/unfulfilled-expectations-revoked-certificates-in-jar-signing">Impidio</a>. We reported this issue to Oracle shortly after its discovery. We will talk about our experiences with reporting this issue in a future blog post. To understand this issue, one has to understand Certificate Revocation Lists (CRLs) first.</p>
<h1 id="certificate-revocation-lists">Certificate Revocation Lists</h1>
<p>If the owner of a certificate wishes to revoke their certificate (i.e. invalidate it, for example, because of compromise), they can request the issuing Certificate Authority (CA) to put the certificate on a CRL (e.g. see <a href="#5">[5]</a>). The CRL distribution point is indicated in the CA’s certificate <a href="#5">[5]</a>. Often, a verifier of a signature checks the certificate’s CRL to see if it is revoked <a href="#6">[6]</a> (e.g. the Windows signature verification of an executable).</p>
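<p>For illustration, the CRL distribution point of a certificate can be read directly from its extensions; a minimal sketch using .NET from PowerShell (<code>signer.cer</code> is a placeholder file name):</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Print the CRL Distribution Points extension (OID 2.5.29.31) of a certificate.
$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 'signer.cer'
$cdp = $cert.Extensions | Where-Object { $_.Oid.Value -eq '2.5.29.31' }
if ($cdp) { $cdp.Format($true) } else { 'No CRL distribution point found.' }
</code></pre></div></div>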
<h1 id="the-security-issue">The Security Issue</h1>
<p>We found evidence that there was some CRL check for JAR signatures in the past (e.g. for Java Applets in JDK 7; see <a href="#3">[1-3]</a>). However, a source code analysis of the JarSigner of the JDK 12 and a review of its official documentation <a href="#4">[4]</a> show that CRLs are not automatically checked, neither by the JarSigner nor anywhere else in the JDK. Instead, as Oracle told us, developers are expected to call the PKIXRevocationChecker explicitly to check for revocations.</p>
<h1 id="the-impact">The Impact</h1>
<p>Since the JarSigner does not check CRLs, any stolen and revoked code-signing certificate can be used to sign JARs without the JarSigner warning users of a revoked certificate. That is, unless users explicitly check revocation with the PKIXRevocationChecker. As verifiers of a signature often check the CRL of the certificate, users of the JarSigner almost certainly expect the JarSigner to do so too. These users rely on CRLs for security, but the JarSigner does not actually provide this level of security.</p>
<h1 id="addressing-the-security-issue">Addressing the Security Issue</h1>
<p>Shortly after discussing the issue with Oracle, they created a ticket to address this issue (<a href="https://bugs.openjdk.java.net/browse/JDK-8242060">JDK-8242060</a>). This ticket is expected to be resolved with JDK 15, which is planned to be released on September 15, 2020 <a href="#6">[6]</a>. Users of the JarSigner should note that this issue will not be addressed in older versions of the JDK (i.e. JDKs before JDK 15).</p>
<div class="sources">
<h1 id="sources">Sources</h1>
<ul>
<li>[<span id="1">1</span>] <a href="https://docs.oracle.com/javase/7/docs/technotes/tools/windows/jarsigner.html">https://docs.oracle.com/javase/7/docs/technotes/tools/windows/jarsigner.html</a></li>
<li>[<span id="2">2</span>] <a href="https://blogs.oracle.com/java-platform-group/signing-code-for-the-long-haul">https://blogs.oracle.com/java-platform-group/signing-code-for-the-long-haul</a></li>
<li>[<span id="3">3</span>] <a href="https://java.com/en/download/help/revocation_options.xml">https://java.com/en/download/help/revocation_options.xml</a></li>
<li>[<span id="4">4</span>] <a href="https://docs.oracle.com/en/java/javase/12/">https://docs.oracle.com/en/java/javase/12/</a></li>
<li>[<span id="5">5</span>] <a href="https://www.csoonline.com/article/2607448/revoke-certificates-when-you-need-to----the-right-way.html">https://www.csoonline.com/article/2607448/revoke-certificates-when-you-need-to—-the-right-way.html</a></li>
<li>[<span id="6">6</span>] <a href="https://openjdk.java.net/projects/jdk/15/">https://openjdk.java.net/projects/jdk/15/</a></li>
</ul>
</div>]]></content><author><name>Daniel Ostovary</name></author><category term="blog" /><summary type="html"><![CDATA[In April 2020 we became aware of a conceptual security issue in the Java JarSigner. The JarSigner does not check certificate revocations, which breaks JAR signing to some extent.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2020-08-26-02-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2020-08-26-02-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">A White Hat Story: Analysis of Secure Variables in AppVeyor</title><link href="https://about.fqa.test.signpath.io/blog/2019/12/13/an-analysis-of-secure-variables-in-appveyor" rel="alternate" type="text/html" title="A White Hat Story: Analysis of Secure Variables in AppVeyor" /><published>2019-12-13T09:38:20+00:00</published><updated>2019-12-13T09:38:20+00:00</updated><id>https://about.fqa.test.signpath.io/blog/2019/12/13/an-analysis-of-secure-variables-in-appveyor</id><content type="html" xml:base="https://about.fqa.test.signpath.io/blog/2019/12/13/an-analysis-of-secure-variables-in-appveyor"><![CDATA[<p>SignPath integrates with other systems, so we have to understand how they work and what the security attributes of certain features are. Any over-reliance on implicit or explicit security guarantees might affect the entire code signing process.</p>
<p>For this reason, we routinely analyze not only our own software and services, but also certain features of third party systems.</p>
<p>In our analysis of AppVeyor, we looked at the encryption of secrets, such as SignPath API tokens. Note that what we have discovered is not an exploitable security issue, but we thought the entire analysis still makes a good read for people interested in application security.</p>
<p>SignPath integrates with AppVeyor builds to verify the origin of build artifacts before they are signed. We recently investigated AppVeyor’s “secure variables” (aka “Encrypt YAML”) <a href="https://www.appveyor.com/docs/build-configuration#secure-variables">feature</a>. We discovered a few interesting things, which we describe in this blog post.</p>
<h1 id="appveyors-secure-variables">AppVeyors Secure Variables</h1>
<p>AppVeyor is a build software that is available on-premises or in the cloud. AppVeyor can be connected to a (public) source code repository, such as your GitHub repository, and be configured to automatically build a new version of the software whenever code in the repository is changed. The AppVeyor build configuration file (appveyor.yml) can also be checked in to a source code repository to make use of its collaboration and versioning features.</p>
<p>However, certain information in this appveyor.yml file might be sensitive (e.g. access credentials to other systems or services required during the build) and should not be visible to all developers (or to everyone in case of open source projects). The AppVeyor secure variables feature allows you to encrypt sensitive data before putting it into the appveyor.yml file. So only AppVeyor can decrypt this data to obtain the original value and use it during the build process. These encrypted values can e.g. be used in the appveyor.yml in configurations for environment variables or within webhooks.</p>
<p>The first thing to be aware of is that secure variables are only protected from users without write access to the appveyor.yml. <strong>Users with write access to this file can obtain the decrypted value</strong>. A user with write access can do this by e.g. just printing the secure variable to the build log (the secure variable is automatically decrypted by AppVeyor).</p>
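<p>For illustration, this is roughly how a secure variable is embedded - and how trivially it can be leaked - in an <code>appveyor.yml</code> (a sketch; the variable name is made up and the encrypted string is the sample value used later in this post):</p>
<div class="language-yaml highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Sketch: a secure variable in appveyor.yml (names and values are examples).
environment:
  api_token:
    secure: FVwaQuowgIbDmPo987vdYrmGEHeelN7l6ni8MTzouU3jtrE8/9w6tYSuq84DHJbXhMQKpxqfZOTRIH0g/flKjg==
build_script:
  # anyone with write access to this file can simply print the decrypted value:
  - ps: Write-Host $env:api_token
</code></pre></div></div>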
<h2 id="secure-variable-encryption">Secure Variable Encryption</h2>
<p>We analysed properties of the encryption of secure variables.</p>
<p>We found out that <strong>the used cipher is a block cipher and has a 128 bit (16 byte) block size</strong>. The cipher type and the block size can be discovered by increasing the length of the plaintext. If the ciphertext length increases equally with the plaintext length, the used cipher is a stream cipher. If the ciphertext length increases in blocks, then it is a block cipher. In the case of a block cipher, one can easily find out the block length by observing the increase in length of the ciphertext.</p>
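<p>This measurement is easy to reproduce locally (a minimal sketch; a local AES encryptor stands in for AppVeyor’s encryption, since AES-CBC with PKCS#7 padding shows the same length behavior):</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Observe how the ciphertext length grows with the plaintext length.
$aes = [System.Security.Cryptography.Aes]::Create()   # defaults to CBC + PKCS#7
foreach ($len in 15..17) {
    $ct = $aes.CreateEncryptor().TransformFinalBlock((New-Object byte[] $len), 0, $len)
    '{0,2} plaintext bytes -> {1,2} ciphertext bytes' -f $len, $ct.Length
}
# 15 -> 16, 16 -> 32, 17 -> 32: a block cipher with 16-byte blocks
</code></pre></div></div>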
<p>Furthermore, we found out that the used cipher mode is <strong>CBC</strong>. For that we performed the following test.</p>
<p>Our test used the following text (unencrypted) as secure variable:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Bearer SecretToken123456789012345678901234567890
</code></pre></div></div>
<p>The hexadecimal representation of this string is:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>42 65 61 72 65 72 20 53 65 63 72 65 74 54 6F 6B
65 6E 31 32 33 34 35 36 37 38 39 30 31 32 33 34
35 36 37 38 39 30 31 32 33 34 35 36 37 38 39 30
</code></pre></div></div>
<p>When using the Encrypt YAML feature for this text, AppVeyor returns the following Base64-encoded encrypted string:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>FVwaQuowgIbDmPo987vdYrmGEHeelN7l6ni8MTzouU3jtrE8/9w6tYSuq84DHJbXhMQKpxqfZOTRIH0g/flKjg==
</code></pre></div></div>
<p>After Base64 decoding, the hex representation is:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>15 5C 1A 42 EA 30 80 86 C3 98 FA 3D F3 BB DD 62
B9 86 10 77 9E 94 DE E5 EA 78 BC 31 3C E8 B9 4D
E3 B6 B1 3C FF DC 3A B5 84 AE AB CE 03 1C 96 D7
84 C4 0A A7 1A 9F 64 E4 D1 20 7D 20 FD F9 4A 8E
</code></pre></div></div>
<p>Here we can already see that our ciphertext is 16 bytes longer than the plaintext. This is probably due to padding and allows us to infer that the block size of the ciphertext is 16 bytes.</p>
<p>To find out more about block size and the mode of operation, we modified the following bytes of the ciphertext:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>15 5C 1A 42 EA 30 80 86 C3 98 FA 3D F3 BB DD 62
B9 86 <strong>11</strong> 77 9E 94 DE E5 EA 78 BC 31 3C E8 B9 4D
E3 B6 B1 3C FF DC 3A B5 84 AE AB CE 03 1C 96 D7
84 C4 0A A7 1A 9F 64 E4 D1 20 7D 20 FD F9 4A 8E
</code></pre></div></div>
<p>This ciphertext decrypted to the following text:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Bearer SecretTokL Q4▒u5i?y▒▒'15668901234567890
</code></pre></div></div>
<p>The hexadecimal representation of this string is:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>42 65 61 72 65 72 20 53 65 63 72 65 74 54 6F 6B
<strong>4C 09 51 34 FD 75 35 16 69 3F 79 FD FD 27 31 1C</strong>
35 <strong>36</strong> 36 38 39 30 31 32 33 34 35 36 37 38 39 30
</code></pre></div></div>
<p>We can see that a change of the third byte in the second ciphertext block changed the whole second plaintext block and the third byte of the next plaintext block. This perfectly matches the behavior of the cipher mode CBC (as displayed in the following figure).</p>
<p><img src="/assets/posts/2019-12-13-cbc.png" alt="CBC" /></p>
<p>Generally, in CBC a change of the byte at index X in a ciphertext block scrambles the entire corresponding plaintext block and changes the byte at index X of the next plaintext block. The observed behavior further confirms that the encryption algorithm is a block cipher with a block length of 16 bytes.</p>
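<p>The same effect can be reproduced locally with a few lines of PowerShell (a sketch; a freshly generated AES-CBC key stands in for AppVeyor’s, using the same test string):</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Flip one bit in the third byte of ciphertext block 2 and decrypt.
$aes = [System.Security.Cryptography.Aes]::Create()   # CBC + PKCS#7 by default
$plain = [System.Text.Encoding]::ASCII.GetBytes('Bearer SecretToken123456789012345678901234567890')
$ct = $aes.CreateEncryptor().TransformFinalBlock($plain, 0, $plain.Length)
$ct[18] = $ct[18] -bxor 0x01                          # byte 3 of block 2 (index 16 + 2)
$dec = $aes.CreateDecryptor().TransformFinalBlock($ct, 0, $ct.Length)
[System.Text.Encoding]::ASCII.GetString($dec)         # block 2 scrambled, one byte of block 3 changed
</code></pre></div></div>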
<p>Our test didn’t lead to an error, but decrypted normally. From that we can infer that <strong>the ciphertext is not integrity protected</strong>.</p>
<p>In many cases, missing integrity protection has security implications. In combination with the CBC cipher mode, this can result in so-called “Padding Oracle” attacks. These attacks exploit</p>
<ul>
<li>the lack of integrity protection and</li>
<li>some properties of CBC mode and PKCS#7 padding</li>
</ul>
<p>and allow attackers to fully or partially decrypt ciphertext! This means AppVeyor is potentially vulnerable to Padding Oracle attacks. Notably, as laid out at the beginning of the post, anyone with write access to the appveyor.yml can decrypt values by design. Thus, a Padding Oracle would have no security implications in AppVeyor anyway.</p>
<p>For more curious readers, we briefly describe our analysis of a potential Padding Oracle in AppVeyor.</p>
<h2 id="padding-oracle-in-appveyor">Padding Oracle in AppVeyor</h2>
<p>For a Padding Oracle to exist, it must be possible to distinguish the following cases (e.g. by different error message, by timing, or other means) when decrypting ciphertext:</p>
<ul>
  <li><strong>Case (a):</strong> A given ciphertext decrypts to a valid plaintext.</li>
  <li><strong>Case (b):</strong> A given ciphertext decrypts to a malformed plaintext (e.g. with non-ASCII characters) but has valid padding.</li>
  <li><strong>Case (c):</strong> A given ciphertext has invalid padding.</li>
</ul>
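<p>Locally, the last two cases are easy to tell apart: with .NET’s default PKCS#7 handling, invalid padding throws a <code>CryptographicException</code>, while valid padding with scrambled content decrypts without error (a sketch, continuing the local AES stand-in from above):</p>
<div class="language-powershell highlighter-rouge"><div class="highlight"><pre class="highlight"><code># Corrupt the last byte of block 3; it propagates into the padding block.
$aes = [System.Security.Cryptography.Aes]::Create()
$plain = [System.Text.Encoding]::ASCII.GetBytes('Bearer SecretToken123456789012345678901234567890')
$ct = $aes.CreateEncryptor().TransformFinalBlock($plain, 0, $plain.Length)
$ct[47] = $ct[47] -bxor 0x01                # last byte of block 3 breaks the padding
try {
    $aes.CreateDecryptor().TransformFinalBlock($ct, 0, $ct.Length) | Out-Null
} catch [System.Security.Cryptography.CryptographicException] {
    'Invalid padding: {0}' -f $_.Exception.Message    # case (c) is distinguishable
}
</code></pre></div></div>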
<p>This information, combined with the knowledge of the functionality of CBC and PKCS#7 and some basic logic, is enough for an attacker to decrypt most of a given ciphertext. To confirm the existence or non-existence of a potential Padding Oracle in AppVeyor, we performed the following tests.</p>
<p>We used a secure variable (the encrypted <code class="language-plaintext highlighter-rouge">Bearer SecretToken123456789012345678901234567890</code>) to let AppVeyor authenticate against our web server. On the web server we received:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>POST / HTTP/1.1
Authorization: Bearer SecretToken123456789012345678901234567890
Content-Type: application/json; charset=utf-8
Host: 54.69.243.156:8000
Content-Length: 8858
Connection: Keep-Alive
</code></pre></div></div>
<p>This matches the behavior of <strong>case (a)</strong>.</p>
<p>We then modified the third byte of the second ciphertext block and again let AppVeyor authenticate against our web server. This time we received:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>POST / HTTP/1.1
Authorization: Bearer SecretTokL Q4▒u5i?y▒▒'15668901234567890
Content-Type: application/json; charset=utf-8
Host: 54.69.243.156:8000
Content-Length: 8860
Connection: Keep-Alive
</code></pre></div></div>
<p>This matches the behavior of <strong>case (b)</strong>.</p>
<p>Lastly, we crafted ciphertext that causes bad padding. For that we changed the last byte of the third ciphertext block. Due to CBC mode, this change should not only scramble the third block, but also change the last byte of the fourth block, which is our padding. Since padding needs to have a specific structure in PKCS#7, this change should cause incorrect padding. We used the following ciphertext:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>15 5C 1A 42 EA 30 80 86 C3 98 FA 3D F3 BB DD 62
B9 86 10 77 9E 94 DE E5 EA 78 BC 31 3C E8 B9 4D
E3 B6 B1 3C FF DC 3A B5 84 AE AB CE 03 1C 96 <strong>00</strong>
84 C4 0A A7 1A 9F 64 E4 D1 20 7D 20 FD F9 4A 8E
</code></pre></div></div>
<p>With this ciphertext we let AppVeyor authenticate against our web server one last time. The result was:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>POST / HTTP/1.1
Content-Type: application/json; charset=utf-8
Host: 54.69.243.156:8000
Content-Length: 8840
Connection: Keep-Alive
</code></pre></div></div>
<p>This matches the behavior of <strong>case (c)</strong>.</p>
<p>Can this be a problem? Yes it can be! This behavior exactly matches the definition of a Padding Oracle. An attacker could now continue this attack as described in <a href="https://robertheaton.com/2013/07/29/padding-oracle-attack/">this blog post</a> and could obtain most or all of the plaintext from the ciphertext. Particularly, if an attacker knows the initialization vector, he/she can obtain all of the plaintext. If an attacker doesn’t know the initialization vector, he/she could obtain the plaintext for the whole ciphertext except the first ciphertext block. However, in many cases the first ciphertext block can be guessed or obtained from the system.</p>
<p>In conclusion, in this article we showed:</p>
<ol>
<li>The functionality of AppVeyor’s secure variables.</li>
<li>The encryption used for AppVeyor’s secure variables.</li>
<li>How a small mistake in encryption could, in principle, lead to Padding Oracle attacks that allow the decryption of most or all of the ciphertext.</li>
</ol>
<p>This post was written together with Marc Nimmerrichter from <a href="https://www.impidio.com/">Impidio</a>. All findings were responsibly disclosed to AppVeyor and only published when we agreed that they contain no exploitable information.</p>]]></content><author><name>Daniel Ostovary</name></author><category term="blog" /><summary type="html"><![CDATA[SignPath integrates with other systems, so we have to understand how they work and what the security attributes of certain features are. Any over-reliance on implicit or explicit security guarantees might affect the entire code signing process.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://about.fqa.test.signpath.io/2019-12-13-bg" /><media:content medium="image" url="https://about.fqa.test.signpath.io/2019-12-13-bg" xmlns:media="http://search.yahoo.com/mrss/" /></entry></feed>