pipe operator in rxjs

pipe was introduced (along with the pipeable operators in RxJS 5.5) so that we can compose any number of operators without patching Observable.prototype.

const source$ = Observable.range(0, 10);
source$
  .filter(x => x % 2)
  .reduce((acc, next) => acc + next, 0)
  .map(value => value * 2)
  .subscribe(x => console.log(x));

The above can be converted to:

import { filter, map, reduce } from 'rxjs/operators';

const source$ = Observable.range(0, 10);
source$.pipe(
  filter(x => x % 2),
  reduce((acc, next) => acc + next, 0),
  map(value => value * 2)
).subscribe(x => console.log(x));

The pros, as the RxJS documentation puts it:

“Problems with the patched operators for dot-chaining are:

  1. Any library that imports a patch operator will augment the Observable.prototype for all consumers of that library, creating blind dependencies. If the library removes their usage, they unknowingly break everyone else. With pipeables, you have to import the operators you need into each file you use them in.
  2. Operators patched directly onto the prototype are not “tree-shakeable” by tools like rollup or webpack. Pipeable operators will be as they are just functions pulled in from modules directly.
  3. Unused operators that are being imported in apps cannot be detected reliably by any sort of build tooling or lint rule. That means that you might import scan, but stop using it, and it’s still being added to your output bundle. With pipeable operators, if you’re not using it, a lint rule can pick it up for you.
  4. Functional composition is awesome. Building your own custom operators becomes much, much easier, and now they work and look just like all other operators from rxjs. You don’t need to extend Observable or override lift anymore.”
We can also compose several operators into a single reusable operator:
import { Observable, pipe } from 'rxjs/Rx';
import { filter, map, reduce } from 'rxjs/operators';

const filterOutEvens = filter(x => x % 2);
const sum = reduce((acc, next) => acc + next, 0);
const doubleBy = x => map(value => value * x);

const complicatedLogic = pipe(
  filterOutEvens,
  doubleBy(2),
  sum
);

const source$ = Observable.range(0, 10);

source$.let(complicatedLogic).subscribe(x => console.log(x)); // 50
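
Since pipe returns a function from observable to observable, the same composed logic can also be applied with the pipe method instead of let:

source$.pipe(complicatedLogic).subscribe(x => console.log(x)); // 50
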
With the tap operator, we can run side-effect logic (logging, metrics, etc.) inside the stream, and it returns an observable that mirrors the source, unaffected by whatever we do in the callback.
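
For illustration, a minimal sketch of tap (assuming RxJS 5.5-style pipeable imports):

import { of } from 'rxjs/observable/of';
import { tap, map } from 'rxjs/operators';

of(1, 2, 3).pipe(
  tap(x => console.log('saw', x)),  // side effect only; values flow through untouched
  map(x => x * 10)
).subscribe(x => console.log(x));   // 10, 20, 30
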
From HERE

unsubscribe in rxjs (angular 2+)

background

In the reactive world (RxJS / Angular 2+), it is common and convenient to create subjects/observables and subscribe to them for event handling and so on. It is basically the GoF observer pattern out of the box.

issue

One caveat we hit recently: we call subscribe() on some subjects from our service in ngOnInit or ngAfterViewInit and forget to unsubscribe in the component. The consequence is that each time the component is recreated during a route change, one more subscription is added to the subject. This is pretty bad if we are doing something heavy in the callback, or even worse, making an HTTP call.

solution 1 – unsubscribe in ngOnDestroy

One solution is to keep a reference to the subscription returned by the subscribe() call and then call its unsubscribe() method in Angular's ngOnDestroy() lifecycle hook. This works and is fine if there are only a few subscriptions. If there are many, and this has to be repeated in every related component, it gets quite tedious.
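
A minimal sketch of this approach (the component and the observable are made up for illustration):

import { OnDestroy, OnInit } from '@angular/core';
import { Subscription } from 'rxjs/Subscription';
import { Observable } from 'rxjs/Observable';
import 'rxjs/add/observable/interval';

export class PollingComponent implements OnInit, OnDestroy {
  private pollSub: Subscription;

  ngOnInit() {
    // keep a reference to the subscription returned by subscribe()
    this.pollSub = Observable.interval(1000).subscribe(tick => console.log(tick));
  }

  ngOnDestroy() {
    // release it when the component is destroyed, otherwise it leaks on every route change
    this.pollSub.unsubscribe();
  }
}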

Solution 2 – custom decorator calling ngOnDestroy

Another solution is to write a custom decorator which provides the ngOnDestroy logic. The component itself still needs to keep a list of its subscriptions.
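
One possible shape of such a decorator (a rough sketch; the decorator name and the subscriptions property are made up):

import { Subscription } from 'rxjs/Subscription';

// class decorator that patches ngOnDestroy to unsubscribe everything in `this.subscriptions`
export function AutoUnsubscribe() {
  return function (constructor: any) {
    const original = constructor.prototype.ngOnDestroy;
    constructor.prototype.ngOnDestroy = function (this: any) {
      (this.subscriptions || []).forEach((sub: Subscription) => sub.unsubscribe());
      if (original) {
        original.apply(this);
      }
    };
  };
}

// usage: put @AutoUnsubscribe() on the component and push each subscription into this.subscriptions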

Solution 3 – use takeUntil operator

This way uses a subject (one per component) to tell all of that component's subscriptions to stop taking values once it emits. It is more declarative, IMHO.

import { OnDestroy } from '@angular/core';
import { Subject } from 'rxjs/Subject';

/**
 * Extend this class if the component has subscriptions that need to be unsubscribed on destroy.
 *
 * example: myObservable.takeUntil(this.destroyed$).subscribe(...);
 */
export abstract class UnsubscribableComponent implements OnDestroy {
  // the subject used to signal the end of subscriptions (usually via the `takeUntil` operator).
  protected destroyed$: Subject<boolean> = new Subject<boolean>();

  protected constructor() {}

  ngOnDestroy(): void {
    this.destroyed$.next(true);
    this.destroyed$.complete();
  }
}

So in the component it can be something like:

export class MyOwnComponent extends UnsubscribableComponent implements OnInit {
  ngOnInit() {
    this._eventsContainerService.allEventsInfo
      .takeUntil(this.destroyed$)
      .subscribe(result => {
        if (result) {
          this.handleAllEventsLoad(result);
        }
      });
  }
}

 

sessionStorage/localStorage scope

Firstly, localStorage and sessionStorage are two objects on the window object. They are tied to the origin of the current window.

As a result they are bound to:

  1. protocol: http and https are different
  2. domain
    1. a subdomain can share with its parent by manually setting document.domain.
    2. xxx.capitalone.com cannot share with yyy.capitalone.com
  3. port
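
A quick sketch of what this origin binding means in practice (domains are illustrative):

// on https://xxx.capitalone.com
localStorage.setItem('token', 'abc');
localStorage.getItem('token'); // 'abc'

// on https://yyy.capitalone.com – a different origin, so a different storage
localStorage.getItem('token'); // null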

The same thing applies to a 302 redirect. The session/local storage values set on a page are not available on the page after the redirect if the two pages have different origins, even if they are in the SAME tab/window.

It can also be understood as per-application, as the values can be viewed in the DevTools' Application tab.

 

WHATWG spec

MDN link

debug typescript mocha and server in vscode

We have recently been developing a GraphQL API using Apollo Server and TypeORM on top of AWS Lambda. Code-wise it is fairly straightforward: schema, then resolvers, then a service layer, then a DAO layer, then models defined with TypeORM annotations/decorators. However, there are two debugging-related issues: unit tests and running GraphQL locally.

unit test

For unit tests, our CI/CD pipeline uses nyc/mocha as the runner. Those are good for running all the test suites and generating coverage reports, etc. However, when it comes to debugging we need to go to the IDE. And as we are using TypeScript, there is one more transpile layer than with vanilla ES5/6, which makes this a bit more complicated.

The good news is that VS Code comes with a powerful built-in Node debugger. With the below config, we can just open a ts file containing mocha tests, set breakpoints and start debugging:

{
  "name": "TS Mocha Tests File",
  "type": "node",
  "request": "launch",
  "program": "${workspaceRoot}/node_modules/mocha/bin/_mocha",
  "args": ["-r", "ts-node/register", "${relativeFile}"],
  "cwd": "${workspaceRoot}",
  "protocol": "inspector",
  "env": { "TS_NODE_PROJECT": "${workspaceRoot}/tsconfig.json"}
}
  • Sets up a node task that launches mocha
  • Passes a -r argument that tells mocha to require ts-node
  • Passes in the currently open file – ${relativeFile}
  • Sets the working directory to the project root – ${workspaceRoot}
  • Sets the node debug protocol to V8 Inspector mode
  • The last one, TS_NODE_PROJECT, I have to set because I am using TypeORM, whose annotations/decorators require emitDecoratorMetadata to be true, which is not the default (see the tsconfig sketch below)
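
For reference, the relevant compiler options might look something like this (a sketch, not our exact tsconfig.json):

{
  "compilerOptions": {
    "target": "es2017",
    "module": "commonjs",
    "sourceMap": true,
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true
  }
}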

Local Run with nodemon

Another issue is that since we are using AWS Lambda, it is not easy to run our GraphQL server locally.
We need to set up a local Koa server with the same schema that the Apollo Lambda handler uses. This way we can access the GraphiQL service at localhost:8080/graphiql.

import 'reflect-metadata';
import * as Koa from 'koa';
import { initDatabase } from '../../dao/data-source';
import * as Router from 'koa-router';
import * as koaBody from 'koa-bodyparser';
import {
    graphqlKoa,
    graphiqlKoa,
} from 'apollo-server-koa';
import { schema } from '../../gq-schema';
import { localConf } from '../../config/config';

export const routes = new Router();

// API entrypoint
const apiEntrypointPath = '/graphql';
const graphQlOpts = graphqlKoa({
    schema,
    context: {msg: 'hello context'}
});

// routes.get(apiEntrypointPath, graphQlOpts);
routes.post(apiEntrypointPath, koaBody(), graphQlOpts);

// GraphiQL entrypoint
routes.get('/graphiql', graphiqlKoa({ endpointURL: apiEntrypointPath }));

(async () => {
  initDatabase(localConf);
  const app = new Koa();
  app.use(routes.routes())
    .use(routes.allowedMethods())
    .listen(8080);
})();

Now we can have nodemon run this server so that every time we make a code change, the server reloads with the new content. Put the below content in nodemon.json in the project root.

{
  "watch": ["./src"],
  "ext": "ts",
  "exec": "ts-node --inspect=0.0.0.0:9229 ./path/to/above/server.ts"
}

Notice we run ts-node with the --inspect flag on port 9229, the default Node.js inspector port, so that we can later debug in Chrome's built-in Node debugger – the green Node icon in Chrome's DevTools console.

Now we can run the local server by adding a command to package.json:

"local": "npm run build && nodemon"

Then run npm run local or yarn local.

Option 2 – debug server with vscode

To debug the above server with VS Code, we need to add some config to launch.json.

    {
      "name": "Local Graphql Server",
      "type": "node",
      "request": "launch",
      "args": [
        "${workspaceRoot}/path/to/above/server.ts"
      ],
      "runtimeArgs": [
        "--nolazy",
        "-r",
        "ts-node/register"
      ],
      "sourceMaps": true,
      "cwd": "${workspaceRoot}",
      "protocol": "inspector",
    },

  • Sets up a node task that starts the server file passed in args (here the path to the server.ts above)
  • Passes in the --nolazy arg for node, which tells v8 to compile your code ahead of time, so that breakpoints work correctly
  • Passes in -r ts-node/register for node, which ensures that ts-node is loaded before it tries to execute your code
  • Sets the working directory to the project root – ${workspaceRoot}
  • Sets the node debug protocol to V8 Inspector mode (see above)

Now we can set breakpoints in VS Code and start debugging.

PS: No Enum in x.d.ts

One thing I noticed today: in an xxx.d.ts file, i.e. a module declaration file, never define things like an enum. The file is used for type/interface declarations only, and its content will NOT be compiled to JS, hence is not available at runtime. So if you define an enum there, it will compile fine, but when you run the application, anywhere you use those enums you get a runtime error.
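
For example, something along these lines compiles but fails at runtime (file and enum names are illustrative):

// messages.d.ts – declarations only, nothing here is emitted to JS
export declare enum MessageLevel {
  Unknown,
  Fatal,
  Critical,
  Error
}

// consumer.ts
import { MessageLevel } from './messages';
console.log(MessageLevel.Fatal); // runtime error – no MessageLevel object ever exists in the JS output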

One alternative is to use a custom type and define the list of strings:

export type MessageLevel = "Unknown" | "Fatal" | "Critical" | "Error";
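
The union type exists only at compile time but can still be checked against plain strings at runtime, for example (the function is made up):

function isSevere(level: MessageLevel): boolean {
  // plain string comparison – no enum object needed at runtime
  return level === 'Fatal' || level === 'Critical';
}

console.log(isSevere('Error')); // false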

aws cli ProfileNotFound

I was trying to do some KMS encryption of our prod credentials with the AWS CLI. After pulling down the temporary AWS STS token for the prod role and running aws --profile SOME_PROD_ROLE kms encrypt xxx, the error botocore.exceptions.ProfileNotFound: The config profile (SOME_DEV_ROLE) could not be found kept popping up.

I checked the ~/.aws/credentials file and made sure the [default] block was the one I needed. Still the same error. So it looked like something else was telling the CLI to use that SOME_DEV_ROLE.

It turns out that while I was using some other CLI tool, AWS_PROFILE got set in my environment, so the CLI tries to locate that profile. Even if another profile is explicitly set with --profile, it still checks that the AWS_PROFILE profile exists, and errors out otherwise. This is not ideal and should be considered a bug in the AWS CLI, IMHO.

So after unsetting the AWS_PROFILE variable, everything worked again.

kms quick bash cmd on MacOS:

echo "Decrypted: $(aws kms decrypt --ciphertext-blob fileb://<(echo $ENCRYPTED_DATA | base64 -D) --query Plaintext --output text | base64 -D)"

href # does not always do nothing

Today we were facing an issue where Angular Universal generated an anchor element with href="false", so before the JS is loaded, the anchor would lead the user to /false.

My first thought was to just use href="#" so that it does nothing. Our URL is something like http://www.c1.com/local. After adding the #, the click always navigated to http://www.c1.com/#. It turns out we have a base tag defined in our HTML, <base href="/" />, which changes the behavior of the #.

One solution is to use <a href="javascript:void(0);">, which makes the anchor do nothing. One drawback, however, is that if we click the anchor before Angular finishes bootstrapping, the anchor's directive has not been loaded yet, so it keeps doing nothing forever…

Eventually what we did is add onclick="return false;" to the anchor; it gets removed once the directive comes in and replaces the behavior. This way we make sure the anchor does not do anything before all the JS loads, and the JS context still works as expected even if the anchor was clicked before it finished loading.

npm git private repo auth

Our enterprise GitHub recently made a change so that all anonymous access from work laptops is blocked. This causes a problem because we have some private repo dependencies in our package.json, like below:

"dependencies": {
  ...
  "sun": "git+https://github.kdc.capitalone.com/NeX/sun-ng2-nex.git",
  ...
}

Whenever I try to do npm install, it gives me an authentication error. I can get it installed by adding my username and password to the GitHub URL, but that would expose my corporate password in plain text.

After some research, it turns out Git uses curl under the hood, so you can use a ~/.netrc file with the credentials. For GitHub it would look something like this:

machine github.com
  login <github username>
  password <password OR github access token>

This way, I can just have a GitHub personal access token sitting in my .netrc file and it will always work. It is better than a password because the corporate password expires every 60/90 days, and then the file would need to be updated each time that happens.