Blog

Marshall McLuhan: The Prophet of the Digital Age
Nearsoft
Apr 05, 2024
New Markets | Exploring Toronto with Nearsoft | Part #01

As the CEO of Nearsoft, a company at the forefront of technological innovation, I, Pedro Camacho, have always been inspired by visionaries who foresaw the potential of the digital era. During a recent two-week stay working in Toronto, Canada, I uncovered a historic gem: the building where Marshall McLuhan, "The Prophet of the Digital Age", hosted his legendary public sessions. The building housed the University of Toronto's "Coach House Institute", renamed the "McLuhan Centre for Culture and Technology" in his honor in 2016.

McLuhan Centre for Culture and Technology, Pedro Camacho, April 2024, Toronto, Canada

This experience offered me a profound connection to the roots of digital prophecy and McLuhan's enduring legacy. Despite his innovative contributions to communication theory and remarkably accurate predictions about the digital age, Marshall McLuhan faced resistance and skepticism within the academic community for several reasons.

McLuhan Centre for Culture and Technology, Pedro Camacho, April 2024, Toronto, Canada

Even without universal acceptance among the university's academics, Marshall McLuhan did not let this hinder his pursuit of knowledge and innovation in the field of media theory. His resilience and dedication to exploring the intricacies of media's impact on society led him to hold open sessions at the Coach House Institute. The venue became a focal point for intellectual exchange, where McLuhan freely shared his groundbreaking ideas with supporters and skeptics alike. The Coach House sessions exemplify McLuhan's commitment to pushing the boundaries of traditional academic discourse and fostering an environment of open inquiry, irrespective of the prevailing academic skepticism of his time.

Full recording of a Monday Night Seminar at the Coach House Institute, University of Toronto (1969/1972)

His unconventional approach and sometimes enigmatic ideas contributed to this perception, and they inspire me:

"The more years I spend building and expanding Nearsoft globally, the more I believe that the great disruptors who find unique angles to approach complex problems are never consensual. They create friction with the status quo and invariably evoke exponential emotions in others."
Pedro Camacho, April 2024

Some people fear change and make excuses not to appreciate it, while others understand and become loyal: your team, collaborators, and partners on the journey to success.

McLuhan Centre for Culture and Technology, Pedro Camacho, April 2024, Toronto, Canada

Marshall McLuhan's work and insights hold a special place in the realm of technology and communication, offering profound reasons why technologists and the broader technology community should admire him. Here are some key reasons:

Visionary Predictions
McLuhan's predictions about the digital age and the internet, made long before these technologies became ubiquitous, demonstrate his exceptional foresight. His concept of the "global village" accurately foretold the interconnectedness of the world through electronic communication, a reality we live in today with the internet. Technologists can admire McLuhan for his ability to see beyond the immediate horizon of technological development and understand the broader implications for society.
Understanding Media's Influence
The phrase "the medium is the message" encapsulates McLuhan's belief that the technology or medium through which we communicate affects society not just by the content delivered but by changing how we interact with the world and with each other. This insight is invaluable for technologists, who create the platforms and mediums that define modern communication, reminding them to consider the broader impact of their creations on human interaction and societal structures.

Interdisciplinary Approach
McLuhan's work crossed various fields, including literature, psychology, and sociology, to analyze and predict the effects of media on individuals and societies. This interdisciplinary approach is something that technologists and innovators can admire and emulate, as the most groundbreaking advancements often occur at the intersection of different fields.

Emphasis on Participatory Media
Long before the rise of social media, McLuhan envisioned a future where media would be participatory and users would not only consume content but also produce it. This prediction has come true in the digital age, where user-generated content dominates. Technologists can appreciate McLuhan's early recognition of this shift, which has profound implications for how technology platforms are designed and operated.

Critical Perspective on Technological Determinism
McLuhan's work offers a nuanced view of technological determinism, the idea that technology shapes society. His insights help technologists understand that while technology is a powerful force in societal change, how society adopts and adapts to technology is complex and multifaceted. This understanding is crucial for responsible innovation.

Educational Influence
McLuhan's teachings and writings have educated generations about the significance of media and technology in shaping human consciousness and societal structures. For technologists and educators alike, McLuhan's work continues to be a rich source of insights on the impact of media and technology on learning and cognition.

Legacy of Innovation
McLuhan's legacy is one of innovation in thought and perspective. He challenged prevailing views and encouraged a deeper examination of the role of technology in society. For technologists like myself, admiring McLuhan means embracing a mindset that questions, innovates, and looks beyond the conventional, to understand not just how technology works, but how it changes us. In essence, Marshall McLuhan's work provides critical frameworks for understanding the implications of technological advancements, making him a figure of immense admiration for those involved in shaping the future of technology.

Conclusion
My interest in scrutinizing Marshall McLuhan's thought is currently very high; in the coming weeks I will delve deeper into his ideas, and mainly into his multidisciplinary vision, to understand many of the topics so relevant to our society today. I will return shortly with more of my learnings about the city of Toronto, Canada, where NEARSOFT is expanding its network of projects and businesses.

#GoNEARSOFT #Disruption #NearsoftNorthAmerica #DigitalBanking
Pedro Camacho
CEO & Co-founder
Why Typical Signing Based on PKI Certificates Is a Pain for Users
Software
Mar 23, 2024
Traditional digital signing processes, rooted in Public Key Infrastructure (PKI) certificates, often present a myriad of challenges for users, making them a cumbersome and frustrating experience. The complexity begins with issuing digital certificates, which requires users to navigate a rigorous and sometimes opaque validation process run by a Certificate Authority (CA). This not only introduces delays but also adds an unnecessary layer of complexity for individuals and organizations alike. Moreover, managing PKI certificates entails significant overheads, including the need for regular renewals and the risk of certificate revocation or expiration, further complicating the user experience. Additionally, the centralized nature of PKI systems introduces potential security vulnerabilities, as they rely on the integrity and security of the issuing authorities.

This centralized model contrasts sharply with the decentralized, user-friendly, and inherently secure approach offered by blockchain technology in document signing and verification (a simplified sketch of the idea follows at the end of this post). By moving away from PKI certificates, e-seal eliminates these pain points, offering a seamless, efficient, and more secure experience for users, and revolutionizing the way we think about and handle digital document security.

The innovative e-seal from NEARSOFT empowers both individuals and organizations to safeguard their documents against fraud, all at an exceptionally affordable rate per digital seal. As we are currently in the beta phase across various institutions, we're pleased to offer special pricing conditions to our next 100 clients. For more details, please download the brochure from the e-seal website and reach out to our sales team.
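To make the contrast concrete, here is a minimal sketch of the hash-anchoring idea, not NEARSOFT's actual e-seal implementation: the helper names (sealDocument, verifyDocument) and the file path are hypothetical, and the "anchored digest" stands in for whatever ledger entry a blockchain-based product would create. The point is that verification needs no certificate chain: the verifier just recomputes the document's hash and compares it with the digest recorded at sealing time.

```typescript
// Hedged sketch of hash-based document sealing, assuming Node.js's
// built-in crypto and fs modules. Helper names are hypothetical.
import { createHash } from 'crypto';
import { readFileSync } from 'fs';

// Compute a SHA-256 digest of the document bytes.
const sealDocument = (path: string): string =>
  createHash('sha256').update(readFileSync(path)).digest('hex');

// No Certificate Authority involved: recompute the digest and compare it
// with the digest that was anchored when the seal was created.
const verifyDocument = (path: string, anchoredDigest: string): boolean =>
  sealDocument(path) === anchoredDigest;

const digest = sealDocument('./contract.pdf');
console.log('anchored digest:', digest);
console.log('still intact:', verifyDocument('./contract.pdf', digest));
```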
Pedro Camacho
CEO & Co-founder
International Day of Girls in Information and Communication Technologies (ICT)
Consulting
Apr 27, 2023
On the International Day of Girls in Information and Communication Technologies (ICT), we recognize and celebrate the crucial role of women in the world of technology. The goal is to advocate for their interests and encourage young women to explore diverse areas of ICT, as well as build a professional career in this sector.

We are aware that women are still underrepresented in the technology industry and face unique challenges. Therefore, it is essential that we continue to support young women on their journey into the world of technology, providing opportunities to learn, grow, and reach their full potential. Only then can we build a more inclusive and equitable world where women can thrive in all areas.

At Nearsoft, we are a technology company primarily focused on banking software development. We consistently invest in female talent as we believe it adds significant value, both to the company's growth and to society.

Sara Tranquada (33 years old), Ana Alves (27 years old), and Maria Rodrigues (23 years old), Nearsoft collaborators in the Mobile, Back-End, and Design departments, respectively, share their testimonies as women interested and active in the world of ICT. Their purpose is to inspire younger generations and challenge preconceived ideas about working in the technology field.

SARA TRANQUADA | 33 years old | Mobile Developer at Nearsoft

"Computer engineering was a course I never thought of pursuing, but now I can't imagine doing anything else. I was indecisive about which path to take after high school because everyone around me, especially my family, was engaged in occupations traditionally associated with their gender. I wanted to pursue an academic path related to technology, thinking it would provide a secure future. Surprisingly, after deciding to study Computer Engineering, I enjoyed it so much that I even pursued a master's degree.

Today, an engineer is still perceived by the majority as a man. However, it hasn't always been this way. During World War II, women led as engineers, but the situation changed after the conflict. It is important, therefore, that we create initiatives to address gender bias in computer science and engineering. This was the inspiration for my doctoral thesis: 'Confronting the Number of Women in Computer Science and Engineering: Making Bias Visible.' Since then, I have earned a Ph.D. in Digital Media and am part of the Nearsoft team, where my skills are recognized in a healthy organizational environment.

I found my place in the technological world, and I believe it is possible for more young women to become interested in this field. The path forward is to invest in the education of the younger generation, informing them of the importance of technology and encouraging them to explore ICT areas."

ANA ALVES | 27 years old | Back-end Developer at Nearsoft

"Ten years ago, when asked about the course I would like to take and the career I aspired to, I never thought about Computer Engineering. 'It's a guy thing,' I often heard about engineering in general. Until someone suggested that option, and I, always wanting to challenge stereotypes since childhood, thought to myself, 'why not?'. I decided to challenge myself and venture into an unknown area. In the initial phase of the course, I faced reality: out of around 70 students, little more than 5 were girls. However, the low representation of women motivated me even more to continue, and today, I am satisfied with working on something that constantly challenges me.
It is crucial that programming is taught as early as possible in schools, considering that we are moving towards an increasingly digital world. This could contribute to changing less positive perceptions regarding the role of women in information technology.

Currently, I work on the backend/middleware team at Nearsoft, where I feel recognized and have never experienced any discrimination or devaluation of my work because of my gender. However, my reality is not the reality of all women. Some still face barriers to being hired or even recognized for their work, with the lingering notion that men are more qualified to work in the technology sector.

To women who want to start in this field: don't be afraid or give up due to social stigmas associated with this area, predominantly occupied by men. If this is a world that somehow captivates you, take the risk. The important thing is that you enjoy learning, since continuous learning is a fundamental value in the technology segment."

MARIA RODRIGUES | 23 years old | UI/UX Designer at Nearsoft

"This path has always been of interest and curiosity to me; I have always liked design and all the areas it encompasses. During my undergraduate studies, I explored all areas of design as a form of communication. When I experienced user experience and interactive design, I was captivated by the multitude of product possibilities we could develop. I participated in projects with sensory interaction and developed digital platforms for various purposes through the design thinking process.

As a woman, I always face inequalities and discrimination, but in my field, there are people of all genders in companies. I even feel that design is understood as art by the uninformed, making it a profession perceived as feminine, given that societal paradigms say that women have a greater aesthetic sense than men. In this sense, the need to combat some prejudices persists, both for women who see their skills reduced to the aesthetic field of design and for men who are not valued in the area.

It is necessary to understand that design is not subjective (as something beautiful) but objective. It is a method of visual manipulation, so we have to go through a process of understanding who we are going to manipulate (geographical position, age, gender, social class, etc.) and the reaction we want them to have to the product we are creating. This is called Design Thinking (research, analysis, empathize, experiment, and test). There is a notion that design is only aesthetic, and we are often not included, as designers, in the functional decisions of the products we develop, when our initial role is also to do this functional survey and analysis, the most important part of our work.

At Nearsoft, I am part of the Design team, where I feel integrated and heard. My role in the company is to handle the user experience and hand off design documents to the programmers.

To those interested in entering the technology field, I recommend investing in knowledge and research in the area; you have many options. The more familiar we are with the subject, the greater our chances of success."

In this way, it is evident that Nearsoft is a company that values diversity in its teams and believes that the inclusion of individuals with different backgrounds, perspectives, skills, and genders is essential for success in an increasingly competitive global market.
The presence of four Nearsoft women at Web Summit 2018 is a clear example of our commitment to promoting diversity and inclusion in our teams and throughout the technology industry. These four women, as part of a diverse team, made a significant contribution to the success of the company's participation in Europe's most important technology event. Therefore, we stand as an example, and with a sense of responsibility, Nearsoft, as a technology company, will continue to participate in efforts to raise awareness of the important role of women in the ICT world.
Tatiana Andrade
Human Resources Specialist
5 Banking Trends to Watch in 2023
Finance
Apr 11, 2023
As the banking industry continues to evolve, we are seeing new trends emerge that will shape the future of finance in 2023 and beyond. At Nearsoft, we are always at the forefront of disruption, so we've compiled a list of the top 5 banking trends that we believe will transform the industry in the next year.

1. Digital Transformation Goes Mainstream: With the pandemic accelerating the adoption of digital banking, we expect to see even more banks invest heavily in technology to offer a seamless digital experience to their customers. From mobile banking apps to chatbots and AI-driven services, banks will be looking to enhance their digital capabilities to meet the changing needs of their customers.

2. Open Banking Takes Center Stage: Open banking has been gaining traction in recent years, and we believe 2023 will be the year when it becomes a mainstream banking trend. Open banking allows customers to share their financial data securely with third-party providers, enabling them to access innovative financial products and services that were previously unavailable. As more banks embrace open banking, we expect to see an explosion of fintech startups and innovative banking solutions in the market.

3. Blockchain Technology Disrupts Traditional Banking: Blockchain technology has the potential to revolutionize the banking industry, and we believe 2023 will be the year when we see more banks exploring its potential. Blockchain technology can enable secure, transparent, and faster transactions, eliminating the need for intermediaries and reducing costs. We expect to see banks adopting blockchain for everything from cross-border payments to trade finance and asset management.

4. Artificial Intelligence (AI) and Machine Learning (ML) Revolutionize Banking: AI and ML are already being used in the banking industry to provide personalized customer experiences and detect fraud. In 2023, we expect to see even more banks adopting AI and ML to enhance their operations, from customer service to risk management and compliance. The use of AI and ML will enable banks to automate routine tasks, make better decisions, and improve efficiency.

5. Customer-Centric Banking Takes Center Stage: With increasing competition from fintech startups and non-bank players, traditional banks will need to focus more on providing exceptional customer experiences to retain their customers. In 2023, we expect to see banks investing more in customer-centric initiatives, such as personalized product offerings, omnichannel experiences, and loyalty programs. Banks that can successfully differentiate themselves through customer experience will be the ones that thrive in the coming years.

At Nearsoft, we are a team of experienced technologists and innovators who are passionate about leveraging emerging technologies to transform businesses. We believe that our expertise can help banks navigate these trends and embrace the disruption that is reshaping the industry. For instance, we can help banks enhance their digital capabilities by developing cutting-edge mobile banking apps and chatbots powered by AI and ML. We can also help banks leverage open banking by developing secure APIs that enable seamless data sharing with third-party providers. Additionally, our expertise in blockchain can help banks explore the potential of this technology for cross-border payments, trade finance, and asset management. Furthermore, we can help banks leverage the power of AI and ML to automate routine tasks, detect fraud, and make better decisions.
Our customer-centric approach to software development can also help banks create personalized product offerings and omnichannel experiences that delight their customers. Overall, Nearsoft is well-positioned to help banks embrace the disruptive trends that are shaping the industry in 2023 and beyond. Our track record of delivering innovative solutions for clients across various industries makes us the ideal partner for banks that are looking to differentiate themselves and stay ahead of the curve.
André Teixeira
Head of Marketing and Sales
A Guide: How to Build and Publish an NPM TypeScript Package
Software
Sep 23, 2022
Introduction

Have you ever found yourself copy-pasting the same bits of code between different projects? This was a constant problem for me, so I started developing TypeScript packages that allowed me to reuse useful pieces of code. This guide will show you, step by step, how to build a package using TypeScript and publish it to the Node Package Manager (npm).

Why use TypeScript?

Using TypeScript will give you a better development experience; it will be your best friend when developing, always "yelling" at you every single time you make a mistake. In the beginning, you may feel that strong typing decreases productivity and isn't worth using. But believe me when I tell you that TypeScript has some serious advantages:

- Optional static typing: types can be added to variables, functions, properties, etc. This helps the compiler and shows warnings about any potential errors in code before the package ever runs. Types are great when using libraries; they let developers know exactly what type of data is expected.
- IntelliSense: one of the biggest advantages of TypeScript is its code completion and IntelliSense, providing active hints as code is added.
- More robust code that is easier to maintain.

In my opinion, TypeScript should be your best pal when building packages! Let's get cooking!

The first step is to create your package folder, picking a creative name:

```
mkdir npm-package-guide-project && cd npm-package-guide-project
```

Create a git repository

Next, let's create a remote git repository for your package. How to create a git repository is out of the scope of this article, but when you create a repository on GitHub it shows you exactly how to do it. Follow the steps there, then come back over here!

Start your package

After the repository is created, you need to create a package.json. It's a JSON file that resides in the project's root directory and holds important information: human-readable metadata about the project, like the project name and description; functional metadata like the package version number; scripts to run in the CLI; and a list of dependencies required by the project.

```
npm init -y
```

After that, we need to create a .gitignore file at the root of the project. We don't want unneeded code entering the repository ✋. For now, we only need to ignore the node_modules folder:

```
echo "node_modules" >> .gitignore
```

Great job! This is what the project should look like in Visual Studio Code and in the git repository. From this point on I will continue adding files from VS Code.

Let's add TypeScript as a devDependency

We'll pin a stable version of TypeScript that is compatible with the other packages used during this guide:

```
npm install --save-dev typescript@4.7
```

Using the --save-dev flag tells npm to install TypeScript as a devDependency. This means that TypeScript is only installed when you run npm install, but not when the end user installs the package. TypeScript is needed to develop the package, but it's not needed when using the package.

To compile the TypeScript, we need to create a tsconfig.json file at the root of the project. This file holds the configuration of the TypeScript compiler (tsc).
{  "compilerOptions": {    "outDir": "./lib",    "target": "ES6",    "module": "CommonJS",    "declaration": true,    "noImplicitAny": true  },  "include": ["src"], // which files to compile  "exclude": ["node_modules", "**/__tests__/*"] // which files to skip} There are many configuration options in the fields of tsconfig.json and it's important to be aware of what they do. target: the language used for the compiled output. Compiling to es6 will make our package compatible with browsers. module: the module manager used in the compiled output. declaration: This should be true when building a package. Typescript will then also export types of definitions together with the compiled JavaScript code so the package can be used with both typescript and JavaScript.  outDir: Compiled output will be written to his folder. include: Source files path. In this case src folder. exclude: What we want to exclude from being compiled by the ts Let’s Code! Now with Typescript compilation set up, we are ready to code a simple function that receives parameters and multiplies them, returning the operation result. For this let’s create a src folder in the root and add an index.ts file:  export const Multiplier = (val: number, val2: number) => val * val2; Then add a build script to package.json: "build": "tsc" Now just run the build command in the console:npm run buildThis will compile your Typescript and create a new folder called lib in the root with your compiled code in JavaScript and type definition.It’s needed to add the lib folder to your .gitignore file. Is not recommended for auto-generated files to go to the git remote repository as it can cause unnecessary conflicts. node_modules/lib Formatting and linting A good package should include rules for linting and formatting. This process is important when multiple people are working/contributing to the same project so that everyone is on the same page when it comes to codebase syntax and style. Like we did with Typescript, these are tools used only for the development of the package. They should be added as devDependencies.Let’s start by adding ESLint to our package:npm install eslint @typescript-eslint/parser @typescript-eslint/eslint-plugin --save-dev eslint: ESLint core library @typescript-eslint/parser: a parser that allows ESLint to understand TypeScript code @typescript-eslint/eslint-plugin: plugin with a set of recommended TypeScript rules Similar to Typescript compiler settings, you can either use the command line to generate a configuration file or create it manually in VSCode. Either way, the ESLint configuration file is needed.Create a .eslintrc file in the root:You can use the following starter config and then explore the full list of rules for your ESLint settings. {  "parser": "@typescript-eslint/parser",  "parserOptions": {    "ecmaVersion": 12,    "sourceType": "module"  },  "plugins": ["@typescript-eslint"],  "extends": ["eslint:recommended", "plugin:@typescript-eslint/recommended"],  "rules": {},  "env": {    "browser": true,    "es2021": true  },} parser: this tells EsLint to run the code through a parse when analyzing the code. plugins: define the plugins you’re using extends: tells ESLint what configuration is set to extend from. The order matters. env: which environments your code will run in Now let’s add a lint script to the package.json. Adding an --ext flag will specify which extensions the lint will have in account. By default it’s .js but we will also use .ts. "lint": "eslint --ignore-path .eslintignore --ext .js,.ts ." 
There is no need for some files to be linted, such as the lib folder. It's possible to prevent linting of unnecessary files and folders by creating a .eslintignore file:

```
node_modules
lib
```

Now ESLint is up and ready! I suggest integrating ESLint into whatever code editor you prefer to use. In VS Code, go to extensions and install the ESLint extension. To check your code using ESLint, you can manually run your script in the command line:

```
npm run lint
```

Now let's set up Prettier

It's common to use ESLint and Prettier at the same time, so let's add Prettier to our project:

```
npm install --save-dev prettier
```

Prettier doesn't need a config file; you can simply run and use it straight away. In case you want to set your own config, you need to create a .prettierrc at the root of your project. If you are curious to know more, there is a full list of format options and the Prettier Playground.

```
// .prettierrc
{
  "semi": false,
  "singleQuote": true,
  "arrowParens": "avoid"
}
```

Let's add the Prettier command to our scripts. Let's also support all files that end in .ts, .js, and .json, and ignore the same files and directories as .gitignore (or create a .prettierignore file):

```
...
"scripts": {
  "test": "echo \"Error: no test specified\" && exit 1",
  "build": "tsc",
  "lint": "eslint --ignore-path .eslintignore --ext .js,.ts .",
  "format": "prettier --ignore-path .gitignore --write \"**/*.+(js|ts|json)\""
},
...
```

Now just run the command npm run format to format and fix all your code.

Conflicts between ESLint and Prettier

It's possible that Prettier and ESLint generate issues when common rules overlap. The best solution here is to use eslint-config-prettier to disable all ESLint rules that are irrelevant to code formatting, as Prettier is already good at it:

```
npm install --save-dev eslint-config-prettier
```

To make it work, you need to go to the .eslintrc file and add "prettier" at the end of your extends list to disable any conflicting rules from other plugins:

```
// .eslintrc
{
  "parser": "@typescript-eslint/parser",
  "parserOptions": {
    "ecmaVersion": 12,
    "sourceType": "module"
  },
  "plugins": ["@typescript-eslint"],
  "extends": ["eslint:recommended", "plugin:@typescript-eslint/recommended", "prettier"],
  "rules": {},
  "env": {
    "browser": true,
    "es2021": true
  }
}
```

With that, the formatting and linting section is complete! Awesome job!

Set up testing with Jest

In my opinion, every package should include unit tests! Let's add Jest to help us with that. Since we are using TypeScript, we also need to add ts-jest and @types/jest:

```
npm install --save-dev jest ts-jest @types/jest
```

Create a jestconfig.json file in the root:

```
// jestconfig.json
{
  "transform": { "^.+\\.(t|j)sx?$": "ts-jest" },
  "testRegex": "(/__tests__/.*|(\\.|/)(test|spec))\\.(jsx?|tsx?)$",
  "moduleFileExtensions": ["ts", "tsx", "js", "jsx", "json", "node"]
}
```

Now let's update the old test script in our package.json file:

```
// package.json
"scripts": {
  ...
  "test": "jest --config jestconfig.json"
  ...
}
```

Let's write a basic test! In the src folder, add a new folder named __tests__, and inside, add a file with a name you like, as long as it ends with test.ts, for example multiplier.test.ts:

```
// multiplier.test.ts
import { Multiplier } from '../index'

test('Test Multiplier function', () => {
  expect(Multiplier(2, 3)).toBe(6)
})
```

In this simple test, we pass the numbers 2 and 3 as parameters through our Multiplier function and expect the result to be 6. Now just run it:

```
npm test
```

It works! Nice job!
The test passes successfully, as you can see, meaning that our function is multiplying correctly.

package.json magic scripts

There are many "magic" scripts available in the Node Package Manager ecosystem, and it's good to automate our package as much as possible. In this section we will look at some of these npm scripts: prepare, prepublishOnly, preversion, version, and postversion.

- prepare: runs when git dependencies are being installed, after prepublish and before prepublishOnly. Perfect for building code.

```
"prepare": "npm run build"
```

- prepublishOnly: serves the same purpose as prepublish and prepare, but runs only on npm publish!

```
"prepublishOnly": "npm test && npm run lint"
```

- preversion: runs before bumping a new package version. Perfect for checking the code with linters.

```
"preversion": "npm run lint"
```

- version: runs after a new version has been bumped. If your package has a git repository, like in our case, a commit and a new version tag will be made every time you bump a new version. This script runs BEFORE the commit is made.

```
"version": "npm run format && git add -A src"
```

- postversion: runs after the commit has been made. A perfect place for pushing the commit as well as the tag.

```
"postversion": "git push && git push --tags"
```

Before publishing

Earlier, we added a .gitignore to our project so that build files would not go to our git repository. For the published package, the opposite applies: we don't want source code to be published with the package, only build files. This can be fixed by adding the files property to package.json:

```
// package.json
...
"files": [
  "lib/**/*"
]
...
```

Now, only the lib folder will be included in the published package!

Final details on package.json

Finally, it's time to prepare our package.json before publishing the package:

```
// package.json
{
  "name": "npm-package-guide",
  "version": "1.0.0",
  "description": "A simple multiplier function",
  "main": "lib/index.js",
  "types": "lib/index.d.ts",
  "scripts": {
    "test": "jest --config jestconfig.json",
    "build": "tsc",
    "lint": "eslint --ignore-path .eslintignore --ext .js,.ts .",
    "format": "prettier --ignore-path .gitignore --write \"**/*.+(js|ts|json)\"",
    "prepare": "npm run build",
    "prepublishOnly": "npm test && npm run lint",
    "preversion": "npm run lint",
    "version": "npm run format && git add -A src",
    "postversion": "git push && git push --tags"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/Rutraz/npm-package-guide.git"
  },
  "keywords": ["npm", "jest", "typescript"],
  "author": "João Santos",
  "license": "ISC",
  ...
```

These final touches to the package.json include adding a nice description, keywords, and an author. The main and types fields are especially important, since they tell npm where the package's modules and type definitions can be imported from.

Commit and push your code

The time has come to push all your work to your remote repository:

```
git add -A && git commit -m "First commit"
git push
```

Publish your package to npm!

To be able to publish your package, you need an npm account. If you don't have one, you can create it at https://www.npmjs.com/signup. Run npm login to log into your npm account. Then all you need to do is publish your package with the command:

```
npm publish
```

If all went smoothly, you can now view your package at https://www.npmjs.com/. We got a package!
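As a quick sanity check, this is roughly what consuming the published package looks like from another project, assuming the package name npm-package-guide used throughout this guide:

```typescript
// In a consuming project, after running: npm install npm-package-guide
// Because we published type declarations (lib/index.d.ts), the import is
// fully typed and the editor provides IntelliSense for Multiplier.
import { Multiplier } from 'npm-package-guide';

const result: number = Multiplier(4, 5);
console.log(result); // 20
```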
Awesome work!

Bumping to a new version

Let's increase the version of our package using our scripts:

```
npm version patch
```

This will create a new tag in git and push it to our remote repository. Now just publish again:

```
npm publish
```

With that, you have a new version of your package! Congratulations! You made it to the end. Hopefully, you now know how to start building your own awesome package!
João Santos
Mobile Developer Lead
Hey, BANKS! Do you really need Another Core Banking System?
Software
Oct 12, 2019
Our minds are conditioned to assume, without questioning, that technological systems involving complex and critical operations are always something tightly crafted in a secret lab and managed by pseudo-scientists from an 80s Hollywood movie.

"complexity is the enemy of execution"

As a result of this common vision, in the most traditional business areas it is very uncommon to ask "why do we need such complexity?". After several years working on complex software architectures for financial institutions, and since 2017 as a fintech and software company founder, I truly believe that traditional banks are the best example of the famous Tony Robbins quote about complexity: "complexity is the enemy of execution". The main reason for this conclusion has long been identified and is well known to all financial services professionals: the dependency on legacy systems.

My goal when I decided to write this article was not to talk about the problems of obsolete, slow, or insecure legacy systems. My goal was to share my disturbing conclusion that those responsible for the technological evolution strategy of these banks are repeating the same old mistakes when trying to remove their current legacy systems from the equation.

dependence on a single component, usually dating from the 80s and 90s, that we call the Core Banking System

What is causing most of these organizations to be doomed to disappear in a shorter period of time than we can imagine? The answer is their dependence on a single component, usually dating from the 80s and 90s, that we call the Core Banking System.

Most Core Banking Systems were developed to offer all the functionality needed to run a bank. And wow... this was great for those who wanted to have a bank up and running quickly! In the 80s and early 90s, there was no internet for everyone, no fintech companies disrupting the business, and most banking products were similar between banks.

The real problems of having an old Core Banking System start when the business processes of your organization slow down every time you need small changes to the system: when you need to support new channel solutions in the digital banking race with other banks, when regulatory changes are mandatory, or when very specific customisation is urgently needed to support a new product in the portfolio.

The Music Lover vs The Music Professional Analogy

Recently I found the perfect analogy for this whole core banking modernization process: the music lover vs the music professional. Let me explain. If you like to listen to music at home and you don't need sound equipment to make a living, buy a hi-fi, or just use your parents' hi-fi from the 90s. If you need to fix something on the old hi-fi, it will be hard to find a good, affordable technician, and some of the parts can also be hard to find. But the point is that you are not a professional; you like to listen to some chill music at the weekend, and this old hi-fi does what you need. This is what most traditional banks that are not investing in innovation are doing: they are simply listening to some chill music and unconsciously waiting for their end.
But if you're a professional musician or a DJ, you have competition and you need to keep your system updated, if possible a state-of-the-art system, so you can compete in the market with other musicians. In this case, you will not buy a "domestic" hi-fi; you will buy several separate professional sound hardware modules and high-quality cables, all of them with standard interfaces, so you can quickly upgrade or simply change sound parameters to better respond and adapt to the space where you are playing your music. This type of system is harder to assemble, but once it is ready it is much easier to adapt to any situation.

The picture on the left is, in my opinion, the perfect analogy for a robust and highly customisable integration layer of a modern bank, where we can connect dozens or hundreds of external modules, and orchestrate and fine-tune the final result.

When a bank with this modern view of systems architecture acquires or develops a new module to integrate into its microservices ecosystem, it already knows the module will be easy to replace in the near future, because it was built with integration and domain isolation in mind. That is the normal life cycle of technological modules in today's super-fast financial industry.

In the end, it's about being a professional musician or choosing to be someone who just likes music and has a hi-fi box; after all, it's all about the end result.

Why are banks replacing old monolithic systems with new monolithic systems?

One of the most difficult questions for me to answer today is why banks are replacing old monolithic systems with new monolithic systems, even though the new systems are built using more modern tools, rely less on proprietary hardware architectures, or even run on the cloud. Just because a vendor shows some example integration services like web service APIs, and you find separate modules in the product catalogue, doesn't mean that you are not buying another monolith!

In the last eight years, I have found it more and more common to see banks replacing their Core Banking System with a new, modern Core Banking System. The reason is that most banks don't have a real engineer on their board, or they have engineers with a short-sighted view of systems architecture, or, in most cases, the decisions are made without the feedback of the engineering team.

For decision-makers with a huge ego who are afraid of technological evolution, the "safest" choice when it comes to making big investments in technology transformation is always to buy expensive, proprietary systems from big-name vendors, normally as a result of one or more reports made for them by one of the big five consulting companies, which have a huge commercial interest in maintaining proprietary, closed, and especially hard-to-adapt systems, so they can keep selling obscenely expensive contracts for development, analysis, consulting, and so on.

Decision-makers should instead focus on the real problem, which is the lack of freedom to manage and rule their own technological path. That freedom is the basis of the fast pace and low maintenance costs of almost all fintech companies.

The soul of the monolith lives on that dark side where the vendor interconnects, on the same code base, all the business rules of different domain areas.

The soul of monolithic systems lives on that dark side where the system vendor interconnects, on the same code base, all the business rules of different domain areas, like the "big spaghetti ball from hell".
Following this approach, vendors can say that the new Core Banking System offers thousands of integration services (web services, queues, ...), but all the old problems will remain. They are just selling a new monolith supported by modern programming languages, installed on standard hardware infrastructure or the cloud.

The secret to success in this modernization process is to free your organization from the dependency on one specific vendor, where you will always need them, or expensive consultancy companies that work with that product, for every task your organization needs to run its systems.

Microservices architecture is the path to modernization and the basis for all business change and evolution inside a bank.

All systems inside the bank should respect industry standards for input and output data structures and isolate small business or integration domain functions in separate software modules (see the sketch at the end of this article). Microservices architecture is the path to modernization and also the basis for all business change and evolution processes inside a bank.

My advice to decision-makers working in the financial industry is to understand that the most important asset inside their organization is data, and they should not depend on one specific group of providers to manage and leverage the power of that data, so the organization can keep generating business and market opportunities.

Systems architecture is the basis of an organization's flexibility to adapt quickly to market needs.

Those decision-makers who don't yet realise that investment in technology doesn't have to mean investment in specific big-name vendors, and that systems architecture is the basis of their organization's flexibility to adapt quickly to market needs, will be responsible for the loss of customers and, in many cases, for the end of the organization. In those cases, the "safe" bet on big-name vendors will be worth nothing in their defence.
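To close with something concrete, here is a minimal sketch in TypeScript of the domain-isolation idea discussed above. It is illustrative only: the interface and class names are hypothetical, not a real banking API. The point is that the integration layer depends only on a small contract, so a vendor module can be swapped without touching the rest of the ecosystem.

```typescript
// Hedged sketch of domain isolation behind standard interfaces.
// All names are hypothetical; an illustration, not a real banking API.

// The integration layer depends only on this contract, never on a vendor.
interface PaymentsService {
  transfer(fromIban: string, toIban: string, amountCents: number): Promise<string>;
}

// One vendor's module, isolated behind the contract.
class LegacyCorePayments implements PaymentsService {
  async transfer(fromIban: string, toIban: string, amountCents: number): Promise<string> {
    // ...call the legacy core banking system here...
    return 'legacy-tx-001';
  }
}

// A replacement module: swapping it in requires no change elsewhere.
class CloudPayments implements PaymentsService {
  async transfer(fromIban: string, toIban: string, amountCents: number): Promise<string> {
    // ...call a modern payments microservice here...
    return 'cloud-tx-001';
  }
}

// The orchestration layer picks an implementation by configuration.
const payments: PaymentsService = process.env.USE_CLOUD_PAYMENTS
  ? new CloudPayments()
  : new LegacyCorePayments();

payments
  .transfer('PT50000201231234567890154', 'PT50000201239876543210112', 10_000)
  .then(tx => console.log('transaction id:', tx));
```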
Pedro Camacho
CEO & Co-founder
