Akamai CloudFront Certificate pinning

We recently took over a project that hosts some static AngularJS code via a standard Route 53 -> CloudFront -> S3 setup. The domain's CDN is Akamai, with its origin set to the Route 53 URL. Akamai has a `Specific Certificates (pinning)` setting that pulls the certificate from the *.cloudfront.net origin.

Over the weekend something odd happened: some of our users could access the site, some got an Akamai error, and some got a 404. It turns out the CloudFront certificate had expired and Amazon had signed a new one. However, on the Akamai side we still held the old cert's hash, which is used for SSL validation, so the SSL connection could not be established. To solve the issue we had to re-extract the hash for the new certificate in the Akamai console, and it worked in our QA environment. By the time we were ready to apply this change to prod, prod suddenly started working on its own… We believe Akamai also has some mechanism to periodically pull the site's new certificate chain, or the max-age of the old hash was reached.

A final solution would be to create a new certificate with the Route 53 domain name to avoid this whole pin/expire issue.

Certificate pinning basically allows us to skip the regular SSL certificate chain verification process and trust only the provided certs. Akamai generates the SHA-1 fingerprint of the leaf cert in the chain and compares against that fingerprint on each SSL handshake.
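If you ever need to re-extract that fingerprint by hand (for example, to compare it against the hash Akamai holds), openssl can do it; a minimal sketch, where the CloudFront hostname is a placeholder:

# fetch the leaf cert from the origin and print its SHA-1 fingerprint
echo | openssl s_client -connect d1234abcd.cloudfront.net:443 \
    -servername d1234abcd.cloudfront.net 2>/dev/null \
  | openssl x509 -noout -fingerprint -sha1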

This link shows Chrome/Symantec working on phasing out Symantec-issued certs (certs issued on Dec 1st, 2017 or later are still good).

npm dev dependency not installed

When running npm install, the dev dependencies usually get installed as well. Today, however, we had a build running inside a Docker container that requires a test result to be generated, and we could not get the tests to run. It turned out the Karma packages in the dev dependencies were not installed.

The reason is that the container in which npm runs has a NODE_ENV environment variable set to production, so npm picks that up and skips installing the dev dependencies.
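A variable like this is typically baked into the image itself; an illustrative Dockerfile line like the one below is all it takes to trigger the behavior:

# anywhere in the Dockerfile before npm install runs
ENV NODE_ENV=production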

Other scenarios where dev dependencies will not be installed:

1. The npm config production value is set to true. Use the command below to check:

npm config get production

2. npm is run with the --production flag.

So to solve this, we just unset that variable and set it back after npm install:


unset NODE_ENV

npm install

export NODE_ENV=production
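Alternatively, per the npm docs, passing --production=false overrides both the config value and NODE_ENV for a single install, so the environment does not need to be touched at all:

npm install --production=false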

exclude file from karma coverage report

We have some files that are not very testable (animation-related code, polyfills and the like) that we want excluded from the Karma coverage report. With angular-cli it looks straightforward; we can include the config below:

 "test": {
    "codeCoverage": {
      "exclude": [
        "src/polyfills.ts",
        "**/test.ts"
      ]
    },
    "karma": {
      "config": "./karma.conf.js"
    }
  }

However, since we are not using the cli, we need to tweak things ourselves. We found the PR that implements the above feature. It looks like we just need to push the files to be excluded into the webpack config's module rule for `istanbul-instrumenter-loader`. Something like:

module: {
  rules: [
    ...
    {
      enforce: 'post',
      test: /\.(js|ts)$/,
      loader: 'istanbul-instrumenter-loader',
      include: helpers.root('src'),
      exclude: [
        /\.(e2e|spec)\.ts$/,
        /node_modules/,
        // add files that need to be excluded from the coverage report below
        `${helpers.root('src')}/services/svg-animations.service.ts`
      ]
    },
    ...
  ]
}

helpers.root is just a helper function to build the proper absolute path:

function root(args) {
  const path = require('path');
  // collect every argument passed in (the named parameter is just a placeholder)
  args = Array.prototype.slice.call(arguments, 0);
  const ROOT = path.resolve(__dirname, '..');
  return path.join.apply(path, [ROOT].concat(args));
}

In the coverageReporter section of karma.conf.js, we can set a lot of useful things like thresholds and the output dir (normalized with subdir), etc.
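For reference, a minimal sketch of such a block (assuming the karma-coverage reporter; the threshold numbers are only illustrative):

coverageReporter: {
  dir: 'coverage',
  subdir: '.', // write reports directly under dir instead of per-browser folders
  reporters: [
    { type: 'html' },
    { type: 'text-summary' }
  ],
  check: {
    // fail the run if global coverage drops below these thresholds
    global: {
      statements: 80,
      branches: 70,
      functions: 80,
      lines: 80
    }
  }
}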

run scp in background

Recently we needed to migrate our church's Drupal site from a GoDaddy VPS to AWS. The PHP/MySQL part was fairly straightforward. The last task was to transfer the video/audio files to EBS. Since the size is pretty big (~30 GB), we did not want to download everything locally and upload it again, so scp became the choice. The command looks something like:

scp -r vpsUser@vpsDomain:/vps/site/path/files/audio /aws/ebs/files

But if the session terminates, the transfer gets interrupted, so we need nohup to help:

nohup scp -r vpsUser@vpsDomain:/vps/site/path/files/audio /aws/ebs/files

However, this way it runs in the foreground, which is not ideal; we need to add & at the end:

nohup scp -r vpsUser@vpsDomain:/vps/site/path/files/audio /aws/ebs/files &

But now the problem is that we have no way to input the password, and we get the message below:

nohup: ignoring input and appending output to `nohup.out'

This is because the backgrounded process is suspended while waiting for user input (the password) that it can never receive.

This seems to be a dead end. The solution is to still run the nohup scp in the foreground so that we can input the password, then press Ctrl-Z to suspend it, and then use bg to send it to the background. If you want to bring it back, fg is the command to run; use the jobs command to check existing jobs, or just use ps aux | grep scp.
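Put together, the whole sequence looks like the below (%1 assumes this is the only job in the current shell):

nohup scp -r vpsUser@vpsDomain:/vps/site/path/files/audio /aws/ebs/files
# type the password at the prompt, then press Ctrl-Z to suspend the job
bg %1     # resume the suspended transfer in the background
jobs      # confirm it is running
fg %1     # optionally bring it back to the foreground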

execute async call sequentially

fixed number of calls

Today I needed to figure out a way to execute a bunch of HTTP calls sequentially. The problem looks simple at first if we have a known number of calls, since we can easily nest them in callbacks one by one or chain the promises:

call1(()=>{
  call2(()=>{
    call3(...);
  })
});

call1()
 .then(() => {
  return call2();
 }).then(()=>{
  return call3();
 })...

number of calls varies

However, if what we get is an array with an arbitrary number of calls, it becomes tricky; the above ways obviously do not work.

promise holder with IIFE

One way is to define a promise holder outside the for loop, and keep reassigning it to the promise returned by chaining onto the previous one:

let itemsToProcess = [...];
let promiseHolder = processItem(itemsToProcess[0]);
for (let i = 1; i < itemsToProcess.length; i++) {
  // the IIFE captures the current index; with `let` this is technically
  // redundant, but it mirrors the classic pattern needed with `var`
  (function (i) {
    let thisItem = itemsToProcess[i];
    let getNewPromise = () => processItem(thisItem);
    promiseHolder = promiseHolder.then(res => {
      onResponse(res);
      return getNewPromise();
    });
  })(i);
}

Or use the reduce function to the rescue:

let itemsToProcess = [...];
itemsToProcess.reduce(
  (lastPromise, item) => lastPromise.then(() => processItem(item)),
  Promise.resolve()
);
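Either way, the end of the chain is itself a promise, so we know when everything has finished; for example, with the reduce variant:

itemsToProcess
  .reduce((lastPromise, item) => lastPromise.then(() => processItem(item)), Promise.resolve())
  .then(() => console.log('all items processed'));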

async and await

A more elegant way is to use async and await inside a for loop, which automatically suspends execution and waits:

// note: this must run inside an async function
for (let item of itemsToProcess) {
  let rs = await processItem(item);
  onResponse(rs);
}

Yes, the code is this simple. The introduction of async/await in Node 7.x+ is really useful: await suspends the current function's evaluation, including inside all control structures.

Caveat with forEach/map

One caveat: be very careful using async callbacks in a forEach/map loop; they will not work as expected if what you want is serial execution.

itemsToProcess.forEach(async (item) => {
  let rs = await processItem(item);
  onResponse(rs);
});

Rather than waiting for the calls one by one, all the async calls fire at the same time, because forEach expects a synchronous function. Think about it this way: an async function returns a promise and it is the caller's responsibility to do something with that promise. However, Array#forEach does not do anything with the return value of the callback; it simply ignores it and calls the callback for each element. Promise.all([...myPromiseArray]) behaves similarly, because a promise is "hot": by the time it is created, the xhr/event has already been fired.
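A tiny illustration of that parallel behavior (processItem and itemsToProcess as above):

// every call starts immediately; Promise.all only waits for them to settle
let promises = itemsToProcess.map(item => processItem(item));
let results = await Promise.all(promises); // resolves once all are done, in array order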

One way is to create a forEachAsync helper, which basically uses the same for loop under the hood:

Array.prototype.forEachAsync = async function (cb) {
  for (let x of this) {
    await cb(x);
  }
};
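Usage then looks like the below; note the outer await, which is how the caller knows the whole loop has finished:

await itemsToProcess.forEachAsync(async (item) => {
  let rs = await processItem(item);
  onResponse(rs);
});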

A good video about async/await

base64 in shell

I used to go to some website whenever I needed to encode/decode base64. It turns out we can do that from the shell directly.

Encode

echo -n 'My TextTo Encode' | base64

The -n prevents echo's trailing newline from being included in the encoded result (by default echo outputs a trailing newline).

Decode

echo 'My Encoded Text' | base64 --decode

We can also use -D (macOS) or -d (Linux) instead of --decode for short.
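A quick round trip to sanity-check (expected output shown in the comments):

echo -n 'My TextTo Encode' | base64
# TXkgVGV4dFRvIEVuY29kZQ==
echo -n 'TXkgVGV4dFRvIEVuY29kZQ==' | base64 --decode
# My TextTo Encode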