work with multiple files in vim

When working with multiple files in vim, there are several options.

Netrw

I first tried netrw, which comes with the vim installation. However, it is not very easy to use: :e. opens the file explorer, as do :Exp, :Vex, and :Sex.

NERDTree

NERDTree is the de facto file explorer. After the frustration with netrw, I decided to give it a try, and it has that many github stars for a reason. Unlike netrw with all its custom config, most NERDTree behaviour works out of the box.

  • s to open a file in a vertical split
  • I to show hidden files in explorer
  • m to open menu with options to do CRUD.
  • Tabs also work nicely: t opens the file in a new tab, gt (or a number followed by gt) navigates between tabs, and ctrl + w toggles focus between windows.
  • ? to show the available shortcuts.
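
A couple of .vimrc lines can make it even more convenient; here is a minimal sketch (the Ctrl-n mapping is just an arbitrary choice):

" toggle the NERDTree explorer with Ctrl-n (any free key works)
map <C-n> :NERDTreeToggle<CR>
" show hidden files by default instead of pressing I every time
let NERDTreeShowHidden=1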

Vim plug

vim-plug is a newer vim plugin manager which seems to be better than alternatives like Vundle. It allows lazy-loading plugins, which is awesome. The config is also easy: just add the desired plugin to .vimrc and run :PlugInstall. To remove a plugin, delete its line from .vimrc and run :PlugClean.
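
For reference, a minimal sketch of a vim-plug section in .vimrc (the plugin list and the lazy-load trigger are just examples):

" plugins managed by vim-plug (assumes plug.vim itself is already installed)
call plug#begin('~/.vim/plugged')

" lazy-load NERDTree only when the toggle command is first used
Plug 'scrooloose/nerdtree', { 'on': 'NERDTreeToggle' }
Plug 'ervandew/supertab'

call plug#end()

Then :PlugInstall fetches anything new, :PlugUpdate upgrades, and :PlugClean removes whatever is no longer listed.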

super tab

SuperTab is a nice plugin for word/code completion. It leverages vim's completion popup, and its context-aware matching is smart.

terminal in vim

If you have vim 8, then :term is built in (`:echo has('terminal')` will output "1"). As with splits in NERDTree, use `ctrl+w` to move between the terminal and other windows.

On macOS, use brew install vim --override-system-vi to upgrade to vim 8.


webpack custom plugin

Recently we worked with a platform that needs webpack to build some ng2/4 assets, plus some custom steps to pull data from a headless CMS (via gulp) and eventually render components. One problem is that there was no live reload/recompile: every time we made a change, we had to run the npm command again to rebuild the resources.

To solve the issue, I decided to write a custom webpack plugin to make browser-sync and webpack work together.

The basic flow is:

  1. Run webpack in watch mode, so every time a resource (ts/css/html) changes, webpack automatically recompiles.
  2. Serve the resources via browser-sync; here browserSync just acts as a mini express server and provides the browser reload capability.
  3. A webpack plugin starts the browser-sync server and registers the reload event when webpack compilation is done.

Plugin

The webpack API is pretty straightforward: it passes a compiler object to the plugin's apply function. The compiler represents the fully configured webpack environment. This object is built once upon starting webpack, and is configured with all operational settings including options, loaders, and plugins. When applying a plugin to the webpack environment, the plugin receives a reference to this compiler; use it to access the main webpack environment.

const browserSync = require('browser-sync');

function WebpackAndromedaPlugin(options) {
  console.log(`WebpackAndromedaPlugin options: ${JSON.stringify(options, null, 2)}`);

  // start browser-sync once, when the plugin is constructed
  let browserSyncConfig = require('./lib/dev-server').getBrowserSyncConfig(options);
  browserSyncConfig.server.middleware.push(require('./lib/dev-server').makeServeOnDemandMiddleware(options));
  browserSync.init(browserSyncConfig);
}

WebpackAndromedaPlugin.prototype.apply = (compiler) => {

  // fired when webpack starts a compilation
  compiler.plugin('compile', function (params) {
    console.log('--->>> andromeda started compiling');
  });

  // fired after the compiled assets have been written, so it is safe to reload the browser
  compiler.plugin('after-emit', (_compilation, callback) => {
    console.log('--->>> files prepared');
    browserSync.reload();
    callback();
  });
}

As we can see above, we can register our callbacks with compiler.plugin(), where webpack exposes the different stages of the build for us to hook into.

Another important object is compilation, which represents a single build of versioned assets. While running the webpack development middleware, a new compilation is created each time a file change is detected, generating a new set of compiled assets. A compilation surfaces information about the present state of module resources, compiled assets, changed files, and watched dependencies. The compilation also provides many callback points at which a plugin may choose to perform custom actions.

For example, all the generated files end up in the compilation.assets object.
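
As a rough sketch using the same plugin-style API as above, a plugin could inspect compilation.assets during the emit stage, right before the files are written out:

// sketch: list the generated assets just before webpack writes them to disk
function ListAssetsPlugin() {}

ListAssetsPlugin.prototype.apply = function (compiler) {
  compiler.plugin('emit', function (compilation, callback) {
    Object.keys(compilation.assets).forEach(function (name) {
      // each asset exposes source() and size()
      console.log('asset:', name, compilation.assets[name].size(), 'bytes');
    });
    callback();
  });
};

module.exports = ListAssetsPlugin;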

webpack-dev-middleware

The webpack-dev-middleware is a nice little express middleware that serves the files emitted from webpack over a connect server. One good feature is serving files from memory, since it uses an in-memory file system that exposes simple methods to read/write/check existence in its MemoryFileSystem.js. The webpack-dev-middleware also exposes some hooks like close/waitUntilValid etc.; unfortunately, the callback that waitUntilValid registers will only be called once, according to the compileDone function here. Still, it is an efficient tool to serve webpack resources and very easy to integrate with the webpack nodejs APIs:

~function() {
  const options = require('./config');
  const browserSync = require('browser-sync');
  const webpackMiddleware = require('webpack-dev-middleware');
  const webpack = require('webpack');
  // helpers from the plugin example above
  const { getBrowserSyncConfig, makeServeOnDemandMiddleware } = require('./lib/dev-server');

  let browserSyncConfig = getBrowserSyncConfig(options);
  browserSyncConfig.server.middleware.push(makeServeOnDemandMiddleware(options));

  const compiler = webpack(require('./webpack.dev'));
  compiler.plugin('done', () => browserSync.reload());

  // webpack-dev-middleware serves the compiled assets from its in-memory file system
  let inMemoryServer = webpackMiddleware(compiler, {noInfo: true, publicPath: '/assets'});
  browserSyncConfig.server.middleware.push(inMemoryServer);

  browserSync.init(browserSyncConfig);
}();

 

webpack-dev-server

The webpack-dev-server is basically a wrapper around the above webpack-dev-middleware. It is good for simple resource serving since it does not expose much. I was trying to find a hook into it to intercept the resources it generates/serves, but did not find a good solution. If you need more customization, it is better to go with webpack-dev-middleware.
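
For completeness, a minimal node API sketch of webpack-dev-server (this assumes the 1.x/2.x-era constructor that matches the compiler.plugin() style used above; the port and publicPath are placeholders):

const webpack = require('webpack');
const WebpackDevServer = require('webpack-dev-server');

const compiler = webpack(require('./webpack.dev'));

// serves the compiled assets from memory, much like webpack-dev-middleware
const server = new WebpackDevServer(compiler, {
  publicPath: '/assets/',
  stats: { colors: true }
});

server.listen(3000, 'localhost', () => console.log('dev server on http://localhost:3000'));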

a very detailed webpack intro article

exclude xml-apis dependency in maven

The xml-apis version 1.0xxx is referenced by multiple hibernate artifacts like hibernate-core, hibernate-entitymanager, etc. It is annoying because it conflicts with the JRE's own javax.xml API in rt.jar, which can cause problems like XML parsing failures.

We need to be extra careful about it since it is usually introduced through transitive dependencies. Manually excluding it from the known offenders definitely helps (a rough sketch is right below), and beyond that we can have a maven plugin do the checking for us: the maven-enforcer-plugin.
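
For example, the manual exclusion on one of the usual suspects looks roughly like this (version omitted; hibernate-core is just one of the artifacts that drags xml-apis in):

         <dependency>
            <groupId>org.hibernate</groupId>
            <artifactId>hibernate-core</artifactId>
            <exclusions>
               <exclusion>
                  <!--the JRE already provides the javax.xml APIs-->
                  <groupId>xml-apis</groupId>
                  <artifactId>xml-apis</artifactId>
               </exclusion>
            </exclusions>
         </dependency>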

If we have an enforcer config similar to the one below and xml-apis.jar still shows up in the dependency tree, the build will fail.

         <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-enforcer-plugin</artifactId>
            <version>1.4.1</version>
            <executions>
               <execution>
                  <id>enforce-banned-dependencies</id>
                  <goals>
                     <goal>enforce</goal>
                  </goals>
                  <configuration>
                     <rules>
                        <bannedDependencies>
                           <excludes>
                              <!--this is to check we do not have the xml-apis included since JRE provides it already-->
                              <exclude>xml-apis:xml-apis</exclude>
                           </excludes>
                        </bannedDependencies>
                     </rules>
                     <fail>true</fail>
                  </configuration>
               </execution>
            </executions>
         </plugin>

There is also some other interesting discussion on this topic around maven's nearest-wins policy.
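
One way to see which artifact drags xml-apis in (and which version the nearest-wins rule picks) is the dependency plugin, e.g.:

mvn dependency:tree -Dincludes=xml-apis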

maven build phase plugin (clover duplicate class)

When using clover2 in a CI build, it might throw a duplicate class error when running: mvn clover2:setup test clover2:aggregate clover2:clover.

The reason is that clover2:setup is bound to the generate-sources phase by default, where it copies the source to target/clover/src-instrumented; the test goal then triggers generate-sources and compile again. We end up with two copies of the source, and the duplicate class error is thrown.

One solution is to bind clover2:setup to the process-sources phase instead:

   <executions>
      <execution>
         <id>instrument</id>
         <phase>process-sources</phase>
         <goals>
            <goal>setup</goal>
         </goals>
      </execution>
   </executions>

More about maven phases and plugins

Maven helps you build a project. The way it does that is through the build lifecycle and the plugins.

The lifecycle is made of phases that you can call explicitly on the command line, for example:

mvn package

package is a phase that is part of the default build lifecycle, like compile or deploy. All the phases of the default lifecycle can be found in the reference. At each phase, Maven calls a goal in a plugin that does something for you. For example, the maven-compiler-plugin has a compile goal that compiles your java code during the compile phase of the lifecycle. You can also explicitly call a plugin on the command line, like:

mvn clean:clean

which calls the clean goal on the maven-clean-plugin. By default this goal is bound to the clean phase, so you can call it by executing mvn clean. You can call any plugin by using its group id, its artifact id, its version and the goal you want to execute, e.g.:

mvn org.codehaus.mojo:versions-maven-plugin:2.1:set -DnewVersion=1.2.3

(A plugin named blah-maven-plugin can be called by the shortened version of its name, blah. See also the Guide to Developing Java Plugins)

To “bind” a plugin goal to a phase, you just need to define your plugin in your pom.xml and bind its execution to a phase of the lifecycle:

      <plugin>
        <groupId>org.mirah.maven</groupId>
        <artifactId>maven-mirah-plugin</artifactId>
        <version>1.1-SNAPSHOT</version>
        <executions>
          <execution>
            <phase>compile</phase>
            <goals><goal>compile</goal></goals>
          </execution>
        </executions>
      </plugin>

Here we are attaching the mirah compile goal to the compile phase of the lifecycle. When Maven executes the compile phase, it will compile the Mirah code for us.

Subversion with Eclipse using proxy

For the Subclipse plugin proxy setting in eclipse

You need to open

C:\Documents and Settings\<user_name>\Application Data\Subversion\servers

 

Note: for windows 7 this could also be found in

c:\Users\<user_name>\AppData\<Roaming or Local>\Subversion\servers

In this file you need to uncomment a few lines under the [global] section:

[global]

# http-proxy-host=proxy1.some-domain-name.com
# http-proxy-port=80
# http-proxy-username=blah
# http-proxy-password=doubleblah
# http-timeout=60
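
After uncommenting and filling in your own proxy details, the section would look something like this (the values are obviously placeholders):

[global]

http-proxy-host=proxy1.some-domain-name.com
http-proxy-port=80
http-proxy-username=blah
http-proxy-password=doubleblah
http-timeout=60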