Key Points
- Node.js fits web server, batch, and scripting program requirements
References
Key Concepts
NVM - Node Version Manager
https://github.com/nvm-sh/nvm#installing-and-updating
Set the NVM default version, then always run
nvm use default
This can be set in the user profile script (.bashrc in the home directory).
Node, NPM support install
uninstall globals before installing NVM to manage Node, NPM
https://docs.npmjs.com/cli/v6/commands/npm-uninstall/
To remove a global NPM package:
npm uninstall -g <package>
Use the NVM command (configured in the user's .bashrc) to install Node and NPM
https://bytearcher.com/articles/ways-to-get-the-latest-node.js-version-on-a-mac/
Ubuntu NVM install
https://askubuntu.com/questions/426750/how-can-i-update-my-nodejs-to-the-latest-version
Uninstall Nodejs on MACOS
Options to Update Node.js
https://phoenixnap.com/kb/update-node-js-version
Use Node config module to manage environment variables
https://zetcode.com/javascript/nodeconfig/
This node-config tutorial shows how to create configuration files for Node applications with the node-config module.
Node-config
Node-config creates configuration files for application deployments.
Node-config allows us to define a set of default parameters and extend them for different deployment environments (development, qa, staging, production, etc.).
The configuration files are located in the default config directory. The location can be overridden with the NODE_CONFIG_DIR environment variable. The NODE_ENV environment variable contains the name of our application's deployment environment; it is development by default.
Node-config supports various configuration file formats, including JSON, YAML, properties, or XML. The default configuration file is default.json (or default.yaml, default.xml). If we use a production deployment, then the configuration is loaded from production.json.
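A minimal sketch of applying those two variables (assumption: they are set before config is first required, since node-config reads them at load time):
// sketch - set these before require('config') so node-config sees the values
process.env.NODE_CONFIG_DIR = __dirname + '/config'; // override the default ./config directory
process.env.NODE_ENV = 'production';                 // selects production.json / production.yaml
const config = require('config');
console.log(config.util.getEnv('NODE_ENV'));         // -> production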
Setting up Node-config
First, we install node-config.
$ node -v
v11.5.0
We use Node version 11.5.0.
$ npm init -y
We initiate a new Node application.
$ npm i config
We install node-config with npm i config.
$ npm i js-yaml
In addition, we install js-yaml for YAML support.
Node-config example
The following example retrieves configuration data with the config package.
{ "app": { "port": 3000 }, "db": { "host": "localhost", "port": 27017, "name": "ydb" } }
We have default.json in the config directory.
const config = require('config');

let appPort = config.get('app.port');
console.log(`Application port: ${appPort}`);

let dbHost = config.get('db.host');
console.log(`Database host: ${dbHost}`);

let dbPort = config.get('db.port');
console.log(`Database port: ${dbPort}`);

let dbName = config.get('db.name');
console.log(`Database name: ${dbName}`);

console.log('NODE_ENV: ' + config.util.getEnv('NODE_ENV'));
We load the config package and get the values with the config.get() function. The deployment type is specified in NODE_ENV (development by default).
$ node simple.js
Application port: 3000
Database host: localhost
Database port: 27017
Database name: ydb
NODE_ENV: development
This is the output.
Node-config example II
We change the configuration file to YAML and set a production deployment environment.
app:
  port: 3000
db:
  host: localhost
  port: 27017
  name: ydb
This is the default.yaml file.
app:
  port: 3300
db:
  host: localhost
  port: 27017
  name: mydb
This is the production.yaml file.
const config = require('config');

let appPort = config.get('app.port');
console.log(`Application port: ${appPort}`);
...
The simple.js file is the same.
$ node simple.js
Application port: 3000
Database host: localhost
Database port: 27017
Database name: ydb
NODE_ENV: development
This is the output for the default environment. The configuration is loaded from default.yaml.
$ set NODE_ENV=production
$ node simple.js
Application port: 3300
Database host: localhost
Database port: 27017
Database name: mydb
NODE_ENV: production
We change the NODE_ENV variable with the set command. (Use export on Linux.) Now the configuration data is loaded from the production.yaml file.
In this tutorial, we have used the node-config package to create configuration files for our Node.js application.
NPM config package to configure different runtime environments
https://www.npmjs.com/package/config
Node-config organizes hierarchical configurations for your app deployments.
It lets you define a set of default parameters, and extend them for different deployment environments (development, qa, staging, production, etc.).
Configurations are stored in configuration files within your application, and can be overridden and extended by environment variables, command line parameters, or external sources.
This gives your application a consistent configuration interface shared among a growing list of npm modules also using node-config.
Project Guidelines
- Simple - Get started fast
- Powerful - For multi-node enterprise deployment
- Flexible - Supporting multiple config file formats
- Lightweight - Small file and memory footprint
- Predictable - Well tested foundation for module and app developers
Quick Start
The following examples are in JSON format, but configurations can be in other file formats.
Install in your app directory, and edit the default config file.
$ npm install config
$ mkdir config
$ vi config/default.json
{
// Customer module configs
"Customer": {
"dbConfig": {
"host": "localhost",
"port": 5984,
"dbName": "customers"
},
"credit": {
"initialLimit": 100,
// Set low for development
"initialDays": 1
}
}
}
Edit config overrides for production deployment:
$ vi config/production.json
{
"Customer": {
"dbConfig": {
"host": "prod-db-server"
},
"credit": {
"initialDays": 30
}
}
}
Use configs in your code:
const config = require('config');
//...
const dbConfig = config.get('Customer.dbConfig');
db.connect(dbConfig, ...);
if (config.has('optionalFeature.detail')) {
const detail = config.get('optionalFeature.detail');
//...
}
config.get() will throw an exception for undefined keys to help catch typos and missing values. Use config.has() to test if a configuration value is defined.
Start your app server:
$ export NODE_ENV=production
$ node my-app.js
Running in this configuration, the port and dbName elements of dbConfig will come from the default.json file, and the host element will come from the production.json override file.
Edit Nodejs in VSCode Tutorial
https://code.visualstudio.com/docs/nodejs/nodejs-tutorial
Simple Nodejs Examples to run Server
https://stackabuse.com/how-to-start-a-node-server-examples-with-the-most-popular-frameworks/
Nodejs application frameworks
https://softwareontheroad.com/nodejs-frameworks/
nodejs-app-frameworks-examples-The Best 10 Nodejs Frameworks for 2019
MANY Nodejs basic tips and tutorials
https://www.freecodecamp.org/news/search/?query=nodejs
https://flaviocopes.com/node-read-csv/
nodejs-servers-How to Start a Node Server Examples with the Most Popular Frameworks.pdf
nodejs-the-complete-guide-course-at-udemy-.pdf
https://www.freecodecamp.org/news/how-to-build-a-todo-app-with-react-typescript-nodejs-and-mongodb/
https://www.freecodecamp.org/news/create-a-professional-node-express/
https://www.freecodecamp.org/news/node-js-production-checklist/
https://www.freecodecamp.org/news/stripe-and-node-js-4-best-practices-and-examples/
Node.js debug
https://www.freecodecamp.org/news/watch-sockets-with-this-grunt-plugin/
https://www.freecodecamp.org/news/node-js-debugging/
Node AWS
https://www.freecodecamp.org/news/how-to-reuse-packages-with-aws-lambda-functions-using-amplify/
Nodejs Basics
https://www.w3schools.com/nodejs/ref_modules.asp
includes built-in modules such as crypto, fs, http, events, and url
CLI args - yargs option for params
https://nodejs.org/en/knowledge/command-line/how-to-parse-command-line-arguments/
How yargs handles your arguments - a quick reference:
- argv.$0 contains the name of the script file which is executed, like: '$0': 'myapp.js'
- argv._ is an array containing each element not attached to an option (or flag); these elements are referred to as commands in yargs.
- Individual options (flags) become properties of argv, such as argv.h and argv.time. Note that non-single-letter flags must be passed in as --flag, like: node myapp.js --time
A summary of elements used in the program (see the sketch below this list):
- argv: the modified process.argv which we have configured with yargs.
- command(): adds commands, their descriptions, and options which are specific to those commands only; in the sketch below, lyr is the command and -y is an lyr-specific option: node myapp.js lyr -y 2016
- option(): adds global options (flags) which can be accessed by all commands or without any command.
- help(): displays a help dialogue when the --help option is encountered, containing a description of all the commands and options available.
- alias(): provides an alias name for an option; in the sketch below, both --help and -h trigger the help dialogue.
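The code those notes refer to is not included here; the sketch below is an assumed reconstruction of that kind of yargs setup (the lyr command and -y option come from the summary above, the descriptions are made up):
// myapp.js - assumed reconstruction, not the original tutorial code
const yargs = require('yargs');

const argv = yargs
  .command('lyr', 'show data for a given year', {
    year: { alias: 'y', describe: 'year to use', type: 'number' } // lyr-specific option
  })
  .option('time', { alias: 't', describe: 'show the current time', type: 'boolean' }) // global option
  .help()
  .alias('help', 'h')
  .argv;

console.log(argv.$0);   // script name, e.g. 'myapp.js'
console.log(argv._);    // commands, e.g. [ 'lyr' ] for: node myapp.js lyr -y 2016
console.log(argv.time); // true for: node myapp.js --time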
For more information on yargs and the many, many other things it can do for your command-line arguments, please visit http://yargs.js.org/docs/
Node.js CLI app with OAuth security
https://developer.okta.com/blog/2019/06/18/command-line-app-with-nodejs
node-cli-ex-oauth-developer.okta-Build a Command Line Application with Nodejs.pdf
Modules - builtin and custom
https://www.w3schools.com/nodejs/nodejs_modules.asp
Create a module that returns the current date and time:
exports.myDateTime = function () {
return Date();
};
Use the exports keyword to make properties and methods available outside the module file.
Save the code above in a file called "myfirstmodule.js".
Include modules with require
var http = require('http');
var dt = require('./myfirstmodule');
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/html'});
res.write("The date and time are currently: " + dt.myDateTime());
res.end();
}).listen(8080);
Http module to create a server
Set a Content-Type header on the response.
The handler function receives req (the request) and res (the response).
var http = require('http');
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/html'});
res.write(req.url);
res.end();
}).listen(8080);
fs File system module
https://www.w3schools.com/nodejs/nodejs_filesystem.asp
Create files
The File System module has methods for creating new files:
fs.appendFile()
fs.open()
fs.writeFile()
Update files
The File System module has methods for updating files:
fs.appendFile()
fs.writeFile()
The fs.appendFile() method appends the specified content at the end of the specified file:
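A short sketch of the create/update methods listed above (file names and content are only illustrative):
var fs = require('fs');

// appendFile creates the file if it does not exist, otherwise appends to it
fs.appendFile('mynewfile1.txt', 'Hello content!', function (err) {
  if (err) throw err;
  console.log('Saved!');
});

// writeFile replaces the file and its content if it already exists
fs.writeFile('mynewfile3.txt', 'Hello content!', function (err) {
  if (err) throw err;
  console.log('Saved!');
});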
Delete a file
var fs = require('fs');
fs.unlink('mynewfile2.txt', function (err) {
if (err) throw err;
console.log('File deleted!');
})
Upload File Module - Formidable
https://www.w3schools.com/nodejs/nodejs_uploadfiles.asp
Include the Formidable module to be able to parse the uploaded file once it reaches the server.
When the file is uploaded and parsed, it is placed in a temporary folder on your computer.
Reading and Write file example
Stackabuse example - Node.js read and write CSV files
https://stackabuse.com/reading-and-writing-csv-files-in-nodejs-with-node-csv/
Install the entire node-csv suite:
npm install node-csv
Read a CSV file with a file stream and the callback API
The csv-parse package provides multiple approaches for parsing CSV files: callbacks, stream + callback, as well as the Sync and Async APIs. We'll be covering the stream + callback API and the Sync API.
var fs = require('fs');
var parse = require('csv-parse');
var parser = parse({columns: true}, function (err, records) {
console.log(records);
});
fs.createReadStream(__dirname+'/chart-of-accounts.csv').pipe(parser);
Here, we create a parser which accepts an object literal containing the options we'd like to set. The second argument is the callback function that's used to access the records - or just print them out, in our case.
Options to specify for the parser include:
- The delimiter option defaults to a comma (,). If the data in the file you're trying to parse uses some other delimiter, like a semicolon (;) or a pipe (|), you can specify it with this option.
- The cast option defaults to false and indicates whether you want to cast the strings to their native data types. For example, a column made up of date fields can be cast into a Date.
- The columns option indicates whether you want to generate the records in the form of object literals. By default it is set to false and records are generated by the parser as arrays. If set to true, the parser will infer the column names from the first line.
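A sketch of passing those options to the callback-style parser (the semicolon delimiter and file name are only illustrative):
var fs = require('fs');
var parse = require('csv-parse');

// delimiter, cast, and columns set explicitly; records arrive as objects keyed by the header row
var parser = parse({ delimiter: ';', cast: true, columns: true }, function (err, records) {
  if (err) throw err;
  console.log(records);
});

fs.createReadStream(__dirname + '/semicolon-data.csv').pipe(parser);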
Read file using Sync API and promises
var fs = require('fs').promises;
var parse = require('csv-parse/lib/sync');
(async function () {
const fileContent = await fs.readFile(__dirname+'/chart-of-accounts.csv');
const records = parse(fileContent, {columns: true});
console.log(records)
})();
We're creating an async function, in which we retrieve the contents of the file by awaiting the response of the readFile() function.
Then, we create a parser which takes in the file contents as the first argument and an object literal as the second. This object literal contains options for creating the parser (we've set columns to true). The parsed records are assigned to a constant and we simply print them out for brevity's sake.
Write file using csv-stringify
csv-stringify is also from the node-csv suite. Stringification just means that we'll convert some data (JSON in our example) into a string. This string is then written to a file, in CSV format.
The csv-stringify package also has a couple of API options, though the callback API offers a really simple way to stringify data, without the need to handle events as with the stream API.
Reading rows back from the stringifier stream:
stringifier.on('readable', function(){
let row;
while(row = stringifier.read()){
data.push(row)
}
})
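For completeness, a minimal sketch of the callback API mentioned above (the records and output file name are assumptions):
var fs = require('fs');
var stringify = require('csv-stringify');

var records = [
  { account: '1000', name: 'Cash' },
  { account: '2000', name: 'Accounts Payable' }
];

// header: true writes the column names as the first CSV line
stringify(records, { header: true }, function (err, output) {
  if (err) throw err;
  fs.writeFile(__dirname + '/accounts.csv', output, function (err) {
    if (err) throw err;
    console.log('CSV file saved');
  });
});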
URL module
https://www.w3schools.com/nodejs/nodejs_url.asp
Parse an address with the url.parse() method, and it will return a URL object with each part of the address as properties:
var url = require('url');
var adr = 'http://localhost:8080/default.htm?year=2017&month=february';
var q = url.parse(adr, true);
console.log(q.host); //returns 'localhost:8080'
console.log(q.pathname); //returns '/default.htm'
console.log(q.search); //returns '?year=2017&month=february'
var qdata = q.query; //returns an object: { year: '2017', month: 'february' }
console.log(qdata.month); //returns 'february'
NPM package manager installed with Node.js
https://www.w3schools.com/nodejs/nodejs_npm.asp
NPM is a package manager for Node.js packages, or modules if you like.
www.npmjs.com hosts thousands of free packages to download and use.
The NPM program is installed on your computer when you install Node.js
A package in Node.js contains all the files you need for a module.
Modules are JavaScript libraries you can include in your project.
npm install mypackage
Events module
https://www.w3schools.com/nodejs/nodejs_events.asp
You can create, fire, and listen for your own events.
Similar to Java beans where events are built-in with create event, addListener, fireEvent
Objects in Node.js can fire events, like the readStream object fires events when opening and closing a file:
Readstream example on file open
var fs = require('fs');
var rs = fs.createReadStream('./demofile.txt');
rs.on('open', function () {
console.log('The file is open');
});
Create eventEmitter
var events = require('events');
var eventEmitter = new events.EventEmitter();
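A short sketch of firing and listening for a custom event with the emitter created above (the event name is just an example):
// register a listener, then fire the event
eventEmitter.on('fileProcessed', function (name) {
  console.log('Finished processing: ' + name);
});
eventEmitter.emit('fileProcessed', 'demofile.txt');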
Email module
https://www.w3schools.com/nodejs/nodejs_email.asp
NestJS - framework for building efficient, scalable Node.js server-side applications.
Flexible app context
https://docs.nestjs.com/application-context
There are several ways of mounting a Nest application. You can create a web app, a microservice, or just a Nest application context. The Nest context is a wrapper around the Nest container, which holds all instantiated classes. We can grab an existing instance from within any imported module directly using the application object. Hence, you can take advantage of the Nest framework everywhere, including CRON jobs, and even build a CLI on top of it.
Getting started
In order to create a Nest application context, we use the following syntax:
async function bootstrap() {
const app = await NestFactory.createApplicationContext(ApplicationModule);
// application logic...
}
bootstrap();
Afterward, Nest allows you to pick any instance registered within the Nest application. Let's imagine that we have a TasksService in the TasksModule. This class provides a set of usable methods which we want to call from within a CRON job.
const app = await NestFactory.createApplicationContext(ApplicationModule);
const tasksService = app.get(TasksService);
And that's it. To grab the TasksService instance we use the get() method. We don't have to go through the entire module tree; the get() method acts like a query that searches for an instance in each registered module automatically. However, if you prefer strict context checking, you can switch to it using a strict: true options object passed as the second argument of the get() method. Then you have to go through the modules to pick up a particular instance from the selected context.
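A sketch of that strict lookup, reusing the TasksService / TasksModule names from the example above (assumes TasksModule is imported):
// strict context checking: select the module first, then resolve the provider inside it
const tasksService = app
  .select(TasksModule)
  .get(TasksService, { strict: true });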
Express request routing
Example from Piotr using a common sdk on FE, BE
Nodejs courses for MongoDB development
Mongoose and schema access to MongoDb
Populate on query result
We'll check the endpoint.
This should be setting sellingPrice
as a priceSchema
which is:
{
dollars: Number,
cents: Number (optional),
currency: 'usd' (default)
}
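A hedged sketch of that price shape as a Mongoose sub-schema (field names come from the note above; the parent schema and everything else is an assumption):
const mongoose = require('mongoose');

// sellingPrice stored as a nested price object on the parent document
const priceSchema = new mongoose.Schema({
  dollars:  { type: Number, required: true },
  cents:    { type: Number },                  // optional
  currency: { type: String, default: 'usd' }   // defaults to 'usd'
}, { _id: false });

const productSchema = new mongoose.Schema({
  name: String,
  sellingPrice: priceSchema
});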
Simple Web Sockets example
https://drive.google.com/file/d/1jdcAQrY3zLP3AiSu0DCSx0w2F38KjDmq/view
Conceptually, the WebSocket protocol is an extension to HTTP which allows clients to "upgrade" an HTTP connection with an additional bi-directional connection which remains established, like this:
- Client opens HTTP connection to server and asks for document
- Server responds with an HTML document
- HTTP connection is closed
- JavaScript code in the HTML document opens another HTTP connection in which it asks the server to upgrade this connection to a WebSocket connection
- A WebSocket connection between client and server is established and remains open for sending and receiving data in both directions
Technically, a WebSocket connection is simply a TCP connection on port 80, just like normal HTTP connections are - with the difference that client and server treat the connection in a special way.
Server code
index.js
Client code
index.html
To build the example ...
after installing node and npm
mkdir wstest
cd wstest
npm init // creates the application (package.json) in wstest
// in a text editor, create index.js
// in a text editor, create index.html
To run example ....
Start node server
cd /blearn/js
node wstest\index.js
Start client web page
In firefox, enter the url for the server to load the HTML page
The page will be updated from the server with new stock prices periodically
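The actual index.js / index.html live in the linked Google Drive example and are not reproduced here; the sketch below only shows the general idea of a server pushing periodic stock prices, assuming the ws npm package:
// index.js (sketch, assumes: npm install ws)
const http = require('http');
const fs = require('fs');
const WebSocket = require('ws');

// serve index.html over plain HTTP on port 8080
const server = http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end(fs.readFileSync(__dirname + '/index.html'));
});

// upgrade requests on the same port become WebSocket connections
const wss = new WebSocket.Server({ server });
wss.on('connection', function (socket) {
  const timer = setInterval(function () {
    socket.send(JSON.stringify({ symbol: 'ABC', price: (100 * Math.random()).toFixed(2) }));
  }, 2000);
  socket.on('close', function () { clearInterval(timer); });
});

server.listen(8080);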
Javascript ( ES17 ) Async Await Support
https://medium.com/javascript-in-plain-english/async-await-javascript-5038668ec6eb
An await for function completion must be inside a function declared with async, as shown in the sketch below.
await is a new operator used to wait for a promise to resolve or reject. It can only be used inside an async function.
Promise.all returns an array with the resolved values once all the passed-in promises have resolved.
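The test1.js referenced below is not included in these notes; a minimal sketch of what it might contain, combining async/await with Promise.all:
// test1.js (sketch) - await must appear inside an async function
function delay(ms, value) {
  return new Promise(function (resolve) { setTimeout(resolve, ms, value); });
}

async function main() {
  const one = await delay(100, 1);                              // wait for a single promise
  const rest = await Promise.all([delay(50, 2), delay(75, 3)]); // wait for several in parallel
  console.log(one, rest);                                       // -> 1 [ 2, 3 ]
}

main();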
To run example ...
node test1.js
Javascript promises support example
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all
Promise.all(iterable)
Promise.all waits for all fulfillments (or the first rejection).
Parameters
iterable - An iterable object, such as an Array.
Return value
- An already resolved Promise if the iterable passed is empty.
- An asynchronously resolved Promise if the iterable passed contains no promises. Note: Google Chrome 58 returns an already resolved promise in this case.
- A pending Promise in all other cases. This returned promise is then resolved/rejected asynchronously (as soon as the stack is empty) when all the promises in the given iterable have resolved, or if any of the promises reject. (See MDN's example about "Asynchronicity or synchronicity of Promise.all".) Returned values will be in order of the Promises passed, regardless of completion order.
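The "code above" referred to below is MDN's Promise.all example, reproduced here; it produces the output shown afterwards:
// PromiseTest.js
const promise1 = Promise.resolve(3);
const promise2 = 42;
const promise3 = new Promise((resolve, reject) => {
  setTimeout(resolve, 100, 'foo');
});

Promise.all([promise1, promise2, promise3]).then((values) => {
  console.log(values);
});
// expected output: Array [3, 42, "foo"]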
create file PromiseTest.js with the code above
To run example
node PromiseTest.js
output =
> Array [3, 42, "foo"]
Nodejs read and write to MongoDB example
/*
nt-mongo-atlas2.js
https://docs.mongodb.com/drivers/node/quick-start
https://docs.mongodb.com/drivers/node/
https://docs.mongodb.com/drivers/node/usage-examples
find by ObjectId ex
https://docs.mongodb.com/drivers/node/usage-examples/findOne
https://kb.objectrocket.com/mongo-db/nodejs-mongodb-find-by-id-686
https://docs.mongodb.com/drivers/node/usage-examples/updateOne
https://docs.mongodb.com/drivers/node/usage-examples/replaceOne
https://wesbos.com/destructuring-objects
*/
var atitle = "\n nt-mongo-atlas2.js - test mongo read write to atlas \n";
console.log(`${atitle}`);
var astep = "start";
const { MongoClient } = require("mongodb");
const uri = "mongodb+srv://dbAdmin:u3KpFTzZBEIEQLzY@cluster0.pfjbe.mongodb.net/test-db";
const connectOptions = {useUnifiedTopology: true};
const client = new MongoClient(uri,connectOptions);
const { ObjectId } = require("mongodb");
var collection = null;
var device = null;
var database = null;
var storageKWH, _id, deviceName, deviceGroup, sumKWH;
async function run() {
try {
await client.connect();
database = client.db("test-db");
collection = database.collection("devices");
await queryDevice();
var {_id, deviceName, deviceGroup, storageKWH} = device;
console.log(`device found = ${deviceName} _id: ${_id}`);
sumKWH = storageKWH + 75;
console.log(` after energy load, sum of storageKWH = ${sumKWH}`);
await updateDevice(device, sumKWH);
console.log(`device energy load completed`);
} finally {
await client.close();
}
};
// https://docs.mongodb.com/drivers/node/usage-examples/findOne
async function queryDevice() {
console.log(`queryDevice called`);
// Query for the device by its ObjectId
// var query = { deviceName: "solar panel 3000 MW v4" };
var oid1 = new ObjectId("6000908b07564302da69e742");
var query = { _id: oid1};
var options = {
// sort matched documents in descending order by deviceName
sort: { deviceName: -1 },
// include only the _id, deviceName, deviceGroup, and storageKWH fields in the returned document
projection: { _id: 1, deviceName: 1, deviceGroup: 1, storageKWH: 1 },
};
// console.log(`query = ${query}`);
device = await collection.findOne(query, options);
// findOne returns the matched document directly (not a cursor); it is stored in the module-level device variable
};
// https://docs.mongodb.com/drivers/node/usage-examples/updateOne
async function updateDevice(device, sumKWH) {
var {_id, deviceName, deviceGroup, storageKWH} = device;
var oid1 = new ObjectId(_id);
var filter = {_id: oid1}
var options = {
upsert: true
};
var updateDoc = {
$set: {
storageKWH: sumKWH,
energyStatus: "loaded"
},
};
const result = await collection.updateOne(filter, updateDoc, options );
console.log(`updateDevice "${deviceName}" matched ${result.matchedCount} document(s), completed with storageKWH = ${sumKWH}`);
};
run().catch(console.dir);
Key Libraries for NodeJS
https://leanylabs.com/blog/npm-packages-for-nodejs/
https://docs.google.com/document/d/1qruyE1PxGCUyv0s7ggYFw59Jd1WizVTv7chmjMIBDoc/edit?usp=sharing
Other NodeJS libraries
MD5 hash
https://gist.github.com/kitek/1579117
var crypto = require('crypto');
var data = "do shash'owania";
var hash = crypto.createHash('md5').update(data).digest("hex"); // hex-encoded MD5 digest
https://ourcodeworld.com/articles/read/1547/how-to-create-md5-hashes-in-javascript
// md5("key", "value") - md5() here comes from the helper library described in the linked article
let hash = md5("carlos", "ourcodeworld");
// hash contains: "80244576c6c4e060a8e14b124cebaaa4"
Potential Value Opportunities
Potential Challenges
Node app tests
- using nvm, install node, npm
- in GitHub, select a repo branch > open terminal on repo >
- source ./.bashrc ( to set node env )
- npm install // installs correct mods
- npm start // should open first page in Chrome
NPM start with environment values
NPM ( or Yarn ) can take NODE_ENV values on the cmd line
NODE_ENV=devLocal PORT=3030 npm start
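Inside the app those values arrive on process.env; a minimal sketch of reading them (the fallback defaults are assumptions):
// read the values passed on the command line: NODE_ENV=devLocal PORT=3030 npm start
const envName = process.env.NODE_ENV || 'development';
const port = process.env.PORT || 3000;
console.log(`Starting ${envName} environment on port ${port}`);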
NPM start from shell script passing a config file for environment
use of custom scripts in ebc-frontend app ok
runnpm.sh -f config/.env.dev
runnpm.sh -f config/.env.devLocal
runnpm.sh -f config/.env.qa
> npm start
> load the config for the environment
.env.dev
ENV_NAME=dev
Node app web page submit fails with CORS error
has been blocked by CORS policy
Access to XMLHttpRequest at 'https://ebc-api-dev.sysopsnetwork.com/authentication' from origin 'http://localhost:3000' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource.
What domain are you making your XMLHttpRequest from and to? CORS is a client-side operation; first you need to identify (1) what page is making the blocked XMLHttpRequest and then where it is making the request to. The error you pasted seems incomplete, so I'm not clear on how to help. Also, it would help if you included which domain the above Express code is running under as well.
https://github.com/expressjs/cors/issues/184
example CORS access control error
I am making a request from the UI (https://localhost:3200) to a Node.js route (/login) running on localhost:8090. After hitting Node.js I make a call to single sign-on using SAML methods; in the response I receive a URL (https://sso.dol.gov), and after receiving the URL I use response.redirect(https://sso.dol.gov). This is when I get the error.
dougwilson commented on Jan 28, 2020
Gotcha. So which one of those parts are using XMLHttpRequest? It sounds to me based on your description like the XMLHttpRequest is the request to localhost:8090, which is redirected to sso.dol.gov. If this is the case, the sso.dol.gov domain is what needs CORS policy applied to it to allow calls from localhost:3200.
Handling CORS error in Nodejs
https://stackabuse.com/handling-cors-with-node-js/
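A minimal sketch of enabling CORS on the Express API with the cors middleware (the origin and route come from the error message above; everything else is an assumption):
// server sketch, assumes: npm install express cors
const express = require('express');
const cors = require('cors');
const app = express();

// allow the front-end origin from the error message above to call this API
app.use(cors({ origin: 'http://localhost:3000' }));

app.post('/authentication', function (req, res) {
  res.json({ ok: true }); // placeholder handler
});

app.listen(8090);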
Candidate Solutions
Magic Nodejs REST Angular CRUD generator - 600 per server
see demo - looks good
server pricing may be an issue except for smaller accounts on cloud or on-prem
Magic will read metadata from your database. It will use this metadata to generate an HTTP REST Web API for you, wrapping all CRUD operations inside of REST endpoints. Then it will use metadata from the REST API to automatically generate Angular components, router links, menu items, datagrids, etc. The end result becomes that before you’ve even had to create as much as a single line of code yourself, 90% of your job is already done.
The resulting frontend gives you a datagrid for all your database tables, allowing you to create, read, update and delete records in your database. It also automatically creates paging, sorting, and filtering for you - In addition to providing you with a very, very, very secure authentication and authorisation system.
Magic is built in .Net Core, and allows you to add C# controllers to it easily. It’s a highly modular architecture, allowing you to intercept its core, using adapters and triggers. The Angular code also perfectly follows all TSLint rules, making it highly readable and easily modified.
sample record edit screen
Prereqs
The simplest way to get started is to download its latest release and use it as a "starter kit". You will need:
- .Net Core CLI version 3.x or more
- Some sort of database server (MySQL or MS SQL)
- NodeJS
- Install Angular in a terminal window with
npm install -g @angular/cli
- Magic
To start the Magic backend, type dotnet run in the "magic.backend" folder. Then type npm link in your "frontend" folder. Start the Magic dashboard by typing ng serve in the "frontend" folder. Go to http://localhost:4200.
Documentation
Although documentation is work in progress, you can rapidly teach yourself the basics here.
License
Although most of Magic's source code is publicly available, Magic is not Open Source or Free Software. You have to obtain a valid license key to install it in production, and I normally charge a fee for such a key. You can obtain a license key here. Notice, 5 hours after you put Magic into production, it will stop working, unless you have a valid license for it.
Profound.js for IBMi brings Nodejs services to iSolutions
https://www.profoundlogic.com/profoundjs/
Quickly and easily add powerful Node.js capabilities to your IBM i business apps
Free products
Node.js Framework
Our free Node.js Framework (available on npm) makes your applications Enterprise-ready by:
- Providing RPG-like capabilities for Node.js
- Enabling top-down transactional business programming capabilities
- Streamlining development and avoiding ‘Callback Hell’
- Offering seamless integration with the Profound UI framework for a robust Rich User Interface
Visual Designer
This drag-and-drop IDE makes it easy to design stunning Web or mobile application interfaces using Node and is available for free on npm.
Billable products
Profound.js Connector
Supports iterative, incremental development, and enables your business to see immediate ROI. The Connector provides:
- The ability for existing RPG programs to directly call Node.js modules, and vice versa
- Direct integration between Node.js user interfaces and RPG program displays
- Database drivers for IBM i to support robust, top-down integration with SQL or Record Level Access methods
- Access to a variety of IBM i resources, such as data areas and low level system API
JumpStart
JumpStart enables users to automatically generate user interfaces for Node.js applications. These Web and mobile templates eliminate the need to start a project from scratch.
Profound.js Converter
Our conversion tool automates the majority of the RPG to Node.js conversion and Profound Logic experts handle the rest, if needed. Unlike most conversions, which often produce more convoluted code, Profound.js simplifies programs and generally creates fewer lines of code than the original RPG.
Profound.js Getting started
https://www.profoundlogic.com/try-profoundjs/
Option 1: Try Profound.js Online @ NodeRun.com
NodeRun is our new cloud environment that provides users with their own spaces to create Node.js applications. The environment comes with a visual designer, full IDE, and has Profound.js pre-installed.
Click here to visit NodeRun
Option 2: Install Profound.js onto your IBM i (AS/400)
Harcon - Nodejs microservices integration
https://www.npmjs.com/package/harcon
harcon is a microservice solution for NodeJS/Browser giving a superior abstraction layer for interoperability between entities in a highly structured and fragmented ecosystem. It allows you to design and implement complex workflows and microservices where the context and causality of messages are important.
Not as powerful as an ESB but useful for multiple microservices similar to a service mesh for Nodejs
The library has a stunning feature list beyond basic messaging functionality.
Channel-agnostic: harcon represents a very abstract messaging framework allowing you to use any underlaying technology your application requires: AMQP, MQTT, Amazon SQS, NATS etc... For amqp integration, please check this: harcon-amqp For sqs integration, please check this: harcon-sqs For mqtt integration, please check this: harcon-mqtt For nats.io integration, please check this: harcon-nats
Tracking: you can monitor every message delivered (request or response) by only few lines of code
Flow control / Reproducibility: A flow of communication / messages can be halted / continued / reinitiated anytime with no effort
Free orchestration: your system can be orchestrated and distributed as you wish, message delivery is not limited to nodes or hosts
Short learning curve: no need to learn hundreds of pages; communication has to be simple after all
Log-free coding: no more mixture of logging and business logic. Harcon logs all messages exchanged.
Transparent: although harcon introduces lots of complex types and structures, your code will be kept clean and pure, everything is (un)packed in the background in a transparent way
Smooth infiltration: your objects / functions will possess the necessary services via injection, no need to create complex structures and compounds
Advanced routing & listening: system fragmentation, qualified names, regular expressions, wildcards, etc.
Execution-chain: toolset to easily define execution paths between entities to relay messages and results.
Business-transaction: small helpers to manage business executions paths as (distributed) transactions.
Note: Harcon's concept is to introduce a clean, high abstraction layer over messaging between entities. As with every abstraction tool, for very simple web apps it can prove to be a liability.
This library starts to shine in a highly structured and distributed environment.
Harcon Message Semantics
harcon distinguishes three message flows:
- request: entity A sends a message to B and requests an answer. Normal RPC model.
- inform: entity A sends a message to B and continues right away not considering whether B even received / processed the message. Signals / notifications are meant to be realised.
- delegate: entity A sends a message to B and requests an answer to a specified address. Usual delegation model.
These flows are available as services for all entities within harcon.
Unique messages
Every communication exchanged possesses the following properties (not exclusively):
- unique ID
- reference to the parent message if exists
- unique ID of the workflow itself
- external ID of the workflow started by an external communication involving a reference number to consider
- timestamp
Any time you send a message or receive an answer, such objects pass through the harcon system, which logs and tracks all of them.
Step-by-step guide for Example
sample code block