npm dev dependency not installed

When running npm install, the dev dependencies are usually installed as well. Today, however, we have a build running inside a Docker container that needs to generate a test result, and we could not get the tests to run. It turned out the Karma packages in the dev dependencies were not installed.

The reason is that the container in which npm runs has a NODE_ENV environment variable set to production, so npm picks that up and skips installing the dev dependencies.

Other scenarios where dev dependencies will not be installed:

1. The npm config production value is set to true. Use the command below to check:

npm config get production

2. npm is run with the --production flag.


So to solve this, we just unset the variable and set it back after npm install.

unset NODE_ENV

npm install

export NODE_ENV=production
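Alternatively, the override can be scoped to a single command instead of mutating the shell's environment. A minimal sketch of how per-command environment scoping behaves:

```shell
# The child process sees the temporary value; the parent shell keeps its own.
export NODE_ENV=production
NODE_ENV=development sh -c 'echo "child sees: $NODE_ENV"'   # child sees: development
echo "parent sees: $NODE_ENV"                               # parent sees: production
```

So NODE_ENV=development npm install should install the dev dependencies without affecting the rest of the build.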


exclude file from karma coverage report

We have some files that are not very testable (animation-related code, polyfills, etc.) that we want excluded from the Karma coverage report. With angular-cli it looks straightforward; we can include the config below:

 "test": {
    "codeCoverage": {
      "exclude": [
    "karma": {
      "config": "./karma.conf.js"

However, since we are not using the cli, we need to tweak it ourselves. I found the PR that implements the above feature. It looks like we just need to push the files to be excluded into the webpack config's module rule for istanbul-instrumenter-loader. Something like:

 rules: [
      {
        enforce: 'post',
        test: /\.(js|ts)$/,
        loader: 'istanbul-instrumenter-loader',
        include: helpers.root('src'),
        exclude: [
          // add files that need to be excluded from the coverage report below
        ]
      }
    ]

The helpers.root is just a helper function to get the proper path.

function root(args) {
  const path = require('path');
  args = Array.prototype.slice.call(arguments, 0);
  const ROOT = path.resolve(__dirname, '..');
  return path.join.apply(path, [ROOT].concat(args));
}

In the coverageReporter section of karma.conf.js, we can set a lot of useful things like thresholds, dir (normalized with subdir), etc.
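For reference, a sketch of what that can look like with karma-coverage (the values below are illustrative, not our actual config):

```javascript
// karma.conf.js (illustrative values)
coverageReporter: {
  dir: 'coverage/',           // output directory
  subdir: '.',                // do not create a per-browser subdirectory
  reporters: [
    { type: 'html' },
    { type: 'text-summary' }
  ],
  check: {                    // fail the run below these thresholds
    global: { statements: 80, branches: 80, functions: 80, lines: 80 }
  }
}
```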

run scp in background

Recently we needed to migrate our church's Drupal site from a GoDaddy VPS to AWS. The PHP/MySQL part is fairly straightforward. The last task was to transfer the video/audio files to EBS. Since the size is pretty big (~30 GB), we did not want to download everything locally and upload it again, so scp became the choice. The command looks something like:

scp -r vpsUser@vpsDomain:/vps/site/path/files/audio /aws/ebs/files

But if the session terminates, the transfer is interrupted, so we need nohup to help:

nohup scp -r vpsUser@vpsDomain:/vps/site/path/files/audio /aws/ebs/files

However, this way it runs in the foreground, which is not ideal; we need to add & at the end:

nohup scp -r vpsUser@vpsDomain:/vps/site/path/files/audio /aws/ebs/files &

But now the problem is we have no way to input the password, and we get the message below:

nohup: ignoring input and appending output to `nohup.out'

This is just because the process is suspended, waiting for the user to input the password.

This seems to be a dead end. The solution is to still run the nohup scp in the foreground so that we can input the password, then use Ctrl-Z to suspend it, and then use bg to send it to the background. If you want to bring it back, fg is the command to run. Also use the jobs command to check existing jobs, or just use ps aux | grep scp.
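The job-control dance can be tried with any long-running command; here sleep stands in for scp (in a real session you would start scp in the foreground, type the password, press Ctrl-Z, then run bg):

```shell
sleep 300 &    # stand-in for the suspended-then-backgrounded scp
jobs           # lists the running background job
kill %1        # clean up the demo job
```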

execute async call sequentially

fix number of calls

Today, I needed to figure out a way to execute a bunch of HTTP calls sequentially. The problem looks simple at first if we have a known number of calls, since we can easily nest them in callbacks one by one or chain the promises:


call1()
  .then(() => {
    return call2();
  })
  .then(() => {
    return call3();
  });

number of calls varies

However, if what we get is an array with an arbitrary number of calls, it becomes tricky; the above way obviously does not work.

promise holder with IIFE

One way is to define a promise outside the for loop, and reassign it with the newly chained promise on each iteration.

let itemsToProcess = [...]
let promiseHolder = processItem(itemsToProcess[0]);
for (let i = 1; i < itemsToProcess.length; i++) {
    (function() {
        let thisItem = itemsToProcess[i];
        let getNewPromise = () => processItem(thisItem);
        promiseHolder = promiseHolder.then(res => {
            return getNewPromise();
        });
    })();
}
Or use the reduce function to the rescue:

let itemsToProcess = [...]
itemsToProcess.reduce((lastPromise, item) =>
    lastPromise.then(() =>
        processItem(item)
    ), Promise.resolve()
);
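For illustration, a runnable version of the reduce pattern; processItem here is a stand-in that resolves after a short delay and records the completion order:

```javascript
const order = [];
// Stand-in for a real HTTP call: resolves after 10 ms and logs its item.
const processItem = (item) =>
  new Promise((resolve) => setTimeout(() => { order.push(item); resolve(item); }, 10));

const itemsToProcess = ['a', 'b', 'c'];

const done = itemsToProcess
  .reduce(
    (lastPromise, item) => lastPromise.then(() => processItem(item)),
    Promise.resolve()
  )
  .then(() => order.join('')); // resolves with "abc" — strictly in order
```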

async and await

A more elegant way is to use async and await inside a for loop, which automatically blocks the execution and waits (the loop has to live inside an async function):

for (let item of itemsToProcess) {
    let rs = await processItem(browser, item);
}

Yes, the code is this simple. The introduction of async/await in Node 7.x+ is really useful: await suspends the current function's evaluation, including all control structures.

Caveat with forEach/map

One caveat: be very careful using async in a forEach/map loop; it will not work as expected if what you want is serial execution.

itemsToProcess.forEach(async (item) => {
    let rs = await processItem(browser, item);
});

Rather than waiting for the executions one by one, all the async calls fire at the same time. forEach expects a synchronous function. Think about it this way: an async function returns a promise, and it is the caller's responsibility to do something with that promise. However, Array#forEach does not do anything with the return value of the callback; it simply ignores it. All it does is call the callback for each element. Promise.all([...myPromiseArray]) behaves similarly, because promises are hot: by the time a promise is created, its xhr/event has already fired.
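A runnable illustration of the caveat, with processItem as a stand-in that takes a configurable delay: the slower first item finishes after the faster second one, proving the calls ran concurrently.

```javascript
const order = [];
// Stand-in for a real call: resolves after ms milliseconds and logs its item.
const processItem = (item, ms) =>
  new Promise((resolve) => setTimeout(() => { order.push(item); resolve(); }, ms));

['slow', 'fast'].forEach(async (item) => {
  await processItem(item, item === 'slow' ? 50 : 10);
});

// Both callbacks fired immediately, so completion order follows delay length.
setTimeout(() => console.log(order.join(',')), 100); // prints "fast,slow"
```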

One way is to create a forEachAsync helper, which basically uses the same for loop under the hood:

Array.prototype.forEachAsync = async function(cb) {
    for (let x of this) {
        await cb(x);
    }
};
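A self-contained sketch of the helper with a usage example (processItem is again a delayed stand-in, repeated here so the snippet runs on its own):

```javascript
Array.prototype.forEachAsync = async function (cb) {
  for (const x of this) {
    await cb(x); // wait for each callback before moving to the next element
  }
};

const order = [];
const processItem = (item) =>
  new Promise((resolve) => setTimeout(() => { order.push(item); resolve(); }, 10));

// forEachAsync is an async function, so it returns a promise we can chain on.
const done = ['1', '2', '3'].forEachAsync(processItem).then(() => order.join('')); // resolves with "123"
```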

A good video about async/await

base64 in shell

I used to go to some website whenever I needed to encode or decode base64. It turns out we can do that from the shell directly.


echo -n 'My TextTo Encode' | base64

The -n prevents echo's newline from being included in the result (by default echo outputs a trailing newline).


echo 'My Encoded Text' | base64 --decode

We can also use -D (macOS) or -d (Linux) instead of --decode for short.
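A quick round trip to sanity-check both directions (base64 with no flags encodes; --decode decodes):

```shell
orig='My Text To Encode'
enc=$(printf '%s' "$orig" | base64)     # encode; printf avoids a trailing newline
printf '%s' "$enc" | base64 --decode    # prints the original text
```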

MacOS bluetooth headset audio issue

Recently my Bluetooth headset started to act up when listening to music on macOS. I initially thought it had died; it turns out it is an OS problem.

Basically, if I use it as both the input and output device, the sound quality is awful. One solution is to go to the sound settings and manually select the internal microphone as the input device; then the audio comes back. However, if you re-connect the device, it defaults to the Bluetooth device as input again, which is quite annoying.

Another possible solution I found is to adjust a setting in Bluetooth Explorer, which can be downloaded from the Apple developer site's Hardware IO Tools for Xcode section. The downloaded dmg is a bundle of various debugging tools. Open Bluetooth Explorer, go to Tools -> Audio Options, and check the "Force use aptX" box. This also improves the sound quality.