Monday, February 1, 2016

NodeJs Part 2




https://github.com/Jam3/devtool
Runs Node.js programs inside Chrome DevTools (using Electron).
# runs a Node.js app in DevTools
devtool src/app.js
This allows you to profile, debug and develop typical Node.js programs with some of the features of Chrome DevTools. See my blog post Debugging Node.js With Chrome DevTools for more details.
npm install -g devtool
devtool app.js

# pipe in content to process.stdin
devtool < audio.mp3

# pipe in JavaScript to eval it
browserify index.js | devtool

http://javascriptplayground.com/blog/2015/03/node-command-line-tool/

  • remove the main entry: this is only used for modules that will be loaded through the module system (e.g. var _ = require('underscore');).
  • add preferGlobal and set it to true, which means if someone installs this module through npm without the --global option, they will be warned that the module is designed to be installed globally.
  • add the bin object, which maps commands to files. This means when this module is installed, npm will set up the filesearch executable to execute index.js.
Now in your project you can run npm link to install the script on your system. This creates a symlink from your project so that you can run the project while working on it, with no need to keep reinstalling it over and over again.
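Putting the three package.json changes above together, a minimal manifest for the filesearch tool might look like this (the name and version here are placeholders):

```json
{
  "name": "filesearch",
  "version": "1.0.0",
  "preferGlobal": true,
  "bin": {
    "filesearch": "index.js"
  }
}
```

Note there is no main entry, and index.js should start with a #!/usr/bin/env node shebang so the shim npm generates can execute it.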
console.log(process.argv);
The first argument is always node, and the second is the path to the file that has been executed. Any following arguments are ones that the user has called your script with, and those are the ones we care about. We can use slice to get an array of just the arguments we need:

var userArgs = process.argv.slice(2);

var searchPattern = userArgs[0];
To run a command on the system we can use the exec method of the child_process module - a module that ships with Node and doesn't need to be installed separately - to execute the right command, passing the user's search pattern through to grep:

var exec = require('child_process').exec;
var child = exec('ls -a | grep ' + searchPattern, function(err, stdout, stderr) {
  console.log(stdout);

});
npm adduser
npm publish
npm install --global filesearch

https://developer.atlassian.com/blog/2015/11/scripting-with-node/

Parsing command line options

https://bitbucket.org/tpettersen/bitbucket-snippet/src/master/index.js?fileviewer=file-view-default

npm install --save commander
var program = require('commander');

program
  .arguments('<file>')
  .option('-u, --username <username>', 'The user to authenticate as')
  .option('-p, --password <password>', 'The user\'s password')
  .action(function(file) {
    console.log('user: %s pass: %s file: %s',
        program.username, program.password, file);
  })
  .parse(process.argv);

Prompting for user input
npm install --save co co-prompt
Coloring terminal output
npm install --save chalk
npm install --save progress

http://shapeshed.com/command-line-utilities-with-nodejs/
Piping data
You receive piped data in a Node.js shell script like this:

process.stdin.resume();
process.stdin.setEncoding('utf8');
process.stdin.on('data', function(data) {
  process.stdout.write(data);
});

echo 'foo' | ./yourscript
UNIX signals
process.stdin.resume();
process.on('SIGINT', function () {
  console.log('Got a SIGINT. Goodbye cruel world');
  process.exit(0);
});

http://modernweb.com/2014/12/08/building-first-node-js-module/
A Node.js module is a JavaScript file built to follow the CommonJS module specification. It encapsulates related functionality into a reusable file that can be used across multiple projects.
Modules are loaded with require: either directly by file path, or, when the module lives in your project's node_modules folder, simply by module name.
module.exports = function (numbers) {
  var value = 0;
  for (var i = 0; i < numbers.length; i++) {
    value = value + numbers[i];
  }
  return value;
};
Testing our Node.js Module
npm install jasmine-node -g
var AddAll = require('../index.js');
We can now describe our test suite as a series of tests. The first step is to call the "describe" method, which takes two arguments: the name of the test suite, and a function that defines the tests making up the suite.

The next step is to add tests to the suite; for this example we will write three tests, each passing a different array of values.
describe("AddAll Suite", function() {
  it("should respond with a value of 6", function(done) {
    var value = AddAll([1,2,3]);
    expect(value).toEqual(6);
    done();
  });
  it("should respond with a value of 10", function(done) {
    var value = AddAll([1,2,3,4]);
    expect(value).toEqual(10);
    done();
  });
  it("should respond with a value of -1", function(done) {
    var value = AddAll([-10,2,3,4]);
    expect(value).toEqual(-1);
    done();
  });
});

jasmine-node spec/
npm install AddAll

http://blog.nodejitsu.com/six-nodejs-cli-apps/
npm install -g http-server
ngist

http://blog.modulus.io/nodejs-scripts
http://blog.modulus.io/top-10-reasons-to-use-node

Node.js in Practice
https://github.com/alexyoung/nodeinpractice
process.stdin.resume();
process.stdin.setEncoding('utf8');

process.stdin.on('data', function(text) {
  process.stdout.write(text.toUpperCase());
});
A key global object is process, which is used to communicate with the operating system.
Another important global is the Buffer class. This is included because JavaScript has traditionally lacked support for binary data.

Some globals are a separate instance for each module. For example, module is available in every Node program, but is local to the current module. Since Node programs may consist of several modules, that means a given program has several different module objects—they behave like globals, but are in module scope.

/usr/local/lib/node_modules
npm search /^express$/
npmsearch

require in Node returns an object rather than loading code into the current namespace, as would occur with a C preprocessor.
module.exports
exports.xx=
delete require.cache[require.resolve('./myclass')];

Loading a group of related modules
Node can treat directories as modules, offering opportunities for logically grouping related modules together.

Node’s module system supports this by allowing directories to act as modules. The easiest way to do this is to create a file called index.js that has a require statement to load each file.

Adding a package.json file to a directory can help the module system figure out how to load all of the files in the directory at once. The JSON file should include a main property to point to a JavaScript file. This is actually the default file Node looks for when loading modules—if no package.json is present, it’ll then look for index.js.

Use __dirname or __filename to determine the location of the file.

console.trace()
Use console.time() and console.timeEnd().
Use the process.arch and process.platform properties. - process.memoryUsage() - process.argv/pid
require('fs').createReadStream(file).pipe(process.stdout);
process.exit() - $? - The Windows equivalent is %errorlevel%.

The process object is an EventEmitter, which means you can add event listeners to it.
Before doing anything with standard input, resume should be called to prevent Node from exiting straight away.
process.kill(pid, [signal])
process.on('SIGHUP', function () {
  console.log('Reloading configuration...');
});
kill -HUP pid

new Buffer('string', 'base64')
var encoded = Buffer(user + ':' + pass).toString('base64');
data:[MIME-type][;charset=<encoding>[;base64],<data>
var encoding = 'base64';
var data = fs.readFileSync('./monkey.png').toString(encoding);

var uri = 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAACsAAAAo...';
var data = uri.split(',')[1];
var buf = Buffer(data, 'base64');
fs.writeFileSync('./secondmonkey.png', buf);

Node.js the Right Way
How Node Applications Work
const fs = require('fs');
fs.watch('target.txt', function () {
  console.log("File 'target.txt' just changed!");
});
"use strict";
const
  fs = require('fs'),
  spawn = require('child_process').spawn, // we're only interested in the spawn() method
  filename = process.argv[2];
if (!filename) {
  throw Error("A file to watch must be specified!");
}
fs.watch(filename, function () {
  let ls = spawn('ls', ['-lh', filename]);
  ls.stdout.pipe(process.stdout);
});
Strict mode is also required to use certain ECMAScript Harmony features in Node, such as the let keyword.
--harmony

EventEmitter provides a channel for events to be dispatched and listeners notified.
fs.watch(filename, function () {
  let
    ls = spawn('ls', ['-lh', filename]),
    output = '';
  ls.stdout.on('data', function (chunk) {
    output += chunk.toString();
  });
  ls.on('close', function () {
    let parts = output.split(/\s+/);
    console.dir([parts[0], parts[4], parts[8]]);
  });
});
fs.readFile('target.txt', function (err, data) {
  if (err) {
    throw err;
  }
  console.log(data.toString());
});
fs.writeFile('target.txt', 'a witty message', function (err) {
  if (err) {
    throw err;
  }
});
#!/usr/bin/env node --harmony
require('fs').createReadStream(process.argv[2]).pipe(process.stdout);

stream = fs.createReadStream(process.argv[2]);
stream.on('data', function (chunk) {
  process.stdout.write(chunk);
});
When working with an EventEmitter, the way to handle errors is to listen for error events.
If you don’t listen for error events, but one happens anyway, Node will throw an exception. And as we saw before, an uncaught exception will cause the process to terminate.
data = fs.readFileSync('target.txt');

net = require('net'),
filename = process.argv[2],
server = net.createServer(function (connection) {
  connection.write("Now watching '" + filename + "' for changes...\n");
  let watcher = fs.watch(filename, function () {
    connection.write("File '" + filename + "' changed: " + Date.now() + "\n");
  });
  connection.on('close', function () {
    watcher.close();
  });
});
server.listen(5432, function () {
  console.log('Listening for subscribers...');
});
telnet localhost 5432
If you need processes on the same computer to communicate, Unix sockets offer a more efficient alternative.
server.listen('/tmp/watcher.sock', function() {
 console.log('Listening for subscribers...');
});
nc -U /tmp/watcher.sock
connection.write(JSON.stringify({
  type: 'watching',
  file: filename
}) + '\n');

connection.write(JSON.stringify({
  type: 'changed',
  file: filename,
  timestamp: Date.now()
}) + '\n');
client = net.connect({port: 5432});
client.on('data', function (data) {
  let message = JSON.parse(data);
  if (message.type === 'watching') {
    console.log("Now watching: " + message.file);
  } else if (message.type === 'changed') {
    let date = new Date(message.timestamp);
    console.log("File '" + message.file + "' changed at " + date);
  } else {
    throw Error("Unrecognized message type: " + message.type);
  }
});
HOW TO TEST
server = net.createServer(function (connection) {
  connection.write(
    '{"type":"changed","file":"targ'
  );
  // after a one second delay, send the other chunk
  let timer = setTimeout(function () {
    connection.write('et.txt","timestamp":1358175758495}' + "\n");
    connection.end();
  }, 1000);
  // clear timer when the connection ends
  connection.on('end', function () {
    clearTimeout(timer);
    console.log('Subscriber disconnected');
  });
});
server.listen(5432, function () {
  console.log('Test server listening for subscribers...');
});
events = require('events'),
util = require('util'),
LDJClient = function (stream) {
  events.EventEmitter.call(this);
  let
    self = this,
    buffer = '';
  stream.on('data', function (data) {
    buffer += data;
    let boundary = buffer.indexOf('\n');
    while (boundary !== -1) {
      let input = buffer.substr(0, boundary);
      buffer = buffer.substr(boundary + 1);
      self.emit('message', JSON.parse(input));
      boundary = buffer.indexOf('\n');
    }
  });
};
util.inherits(LDJClient, events.EventEmitter);
In JavaScript, the value of this is assigned inside each function when it is invoked, at runtime. The value of this is not tightly bound to any particular object like in classical languages. It’s more like a special variable.
exports.LDJClient = LDJClient;
exports.connect = function (stream) {
  return new LDJClient(stream);
};

netClient = net.connect({ port: 5432 }),
ldjClient = ldj.connect(netClient);

SEPARATING CONCERNS
brew install zmq
npm install zmq
node --harmony -p -e 'require("zmq")'
publisher = zmq.socket('pub');
publisher.send(JSON.stringify({
  type: 'changed',
  file: filename,
  timestamp: Date.now()
}));
publisher.bind('tcp://*:5432', function (err) {
  console.log('Listening for zmq subscribers...');
});

subscriber = zmq.socket('sub');
// subscribe to all messages
subscriber.subscribe("");
subscriber.on("message", function (data) {
  let
    message = JSON.parse(data),
    date = new Date(message.timestamp);
  console.log("File '" + message.file + "' changed at " + date);
});
subscriber.connect("tcp://localhost:5432");
Responding to Requests
responder = zmq.socket('rep');
// handle incoming requests
responder.on('message', function (data) {
  // parse incoming message
  let request = JSON.parse(data);
  console.log('Received request to get: ' + request.path);
  // read file and reply with content
  fs.readFile(request.path, function (err, content) {
    console.log('Sending response content');
    responder.send(JSON.stringify({
      content: content.toString(),
      timestamp: Date.now(),
      pid: process.pid
    }));
  });
});
// listen on TCP port 5433
responder.bind('tcp://127.0.0.1:5433', function (err) {
  console.log('Listening for zmq requesters...');
});
requester = zmq.socket('req');
// handle replies from responder
requester.on("message", function (data) {
  let response = JSON.parse(data);
  console.log("Received response:", response);
});
requester.connect("tcp://localhost:5433");
// send request for content
console.log('Sending request for ' + filename);
requester.send(JSON.stringify({
  path: filename
}));
There is a catch to using ØMQ REP/REQ socket pairs with Node. Each endpoint of the application operates on only one request or one response at a time. There is no parallelism.

Routing and Dealing Messages
A ROUTER socket uses these frames to route each reply message back to the connection that issued the request.

Node.js uses a single-threaded event loop, so to take advantage of multiple cores or multiple processors on the same computer, you have to spin up more Node processes.
const cluster = require('cluster');
if (cluster.isMaster) {
  // fork some worker processes
  for (let i = 0; i < 10; i++) {
    cluster.fork();
  }
} else {
  // this is a worker process, do some work
}
cluster.on('online', function (worker) {
  console.log('Worker ' + worker.process.pid + ' is online.');
});
cluster.on('exit', function (worker, code, signal) {
  console.log('Worker ' + worker.process.pid + ' exited with code ' + code);
});
const
  cluster = require('cluster'),
  fs = require('fs'),
  zmq = require('zmq');

if (cluster.isMaster) {
  // master process - create ROUTER and DEALER sockets, bind endpoints
  let
    router = zmq.socket('router').bind('tcp://127.0.0.1:5433'),
    dealer = zmq.socket('dealer').bind('ipc://filer-dealer.ipc');

  // forward messages between router and dealer
  router.on('message', function() {
    let frames = Array.prototype.slice.call(arguments);
    dealer.send(frames);
  });
  dealer.on('message', function() {
    let frames = Array.prototype.slice.call(arguments);
    router.send(frames);
  });

  // listen for workers to come online
  cluster.on('online', function(worker) {
    console.log('Worker ' + worker.process.pid + ' is online.');
  });

  // fork three worker processes
  for (let i = 0; i < 3; i++) {
    cluster.fork();
  }
} else {
  // worker process - create REP socket, connect to DEALER
  let responder = zmq.socket('rep').connect('ipc://filer-dealer.ipc');

  responder.on('message', function(data) {
    // parse incoming message
    let request = JSON.parse(data);
    console.log(process.pid + ' received request for: ' + request.path);

    // read file and reply with content
    fs.readFile(request.path, function(err, data) {
      console.log(process.pid + ' sending response');
      responder.send(JSON.stringify({
        pid: process.pid,
        data: data.toString(),
        timestamp: Date.now()
      }));
    });
  });
}
