Monday, September 27, 2021

2021-09-27 Monday - On Writing - and Reviewing - Wielding The Blazing Sword of Conciseness

(image credit: Skitterphoto)


After finishing a normal workday - I often find myself working late into the night...either writing - or reading.

Tonight - it is the latter - burning the midnight oil - to complete my review feedback of the first four chapters of a new Manning Publications Co. book from a new author (thank you for the very kind invitation, Aleksandar Dragosavljević, [Review Editor]).

There is so much I enjoy about the craft of writing - and especially when I can merge that passion with the many nuanced concerns of architecture and design - and reviewing is just one aspect of the writing and design process.

A streaming of words from my consciousness, on writing - and reviewing: The shaping of the communication, the choice of words, the tempo and cadence - just as important as in the composition of music, the distilling of ideas, skills, knowledge, innovative improvements in how something can be accomplished, the act of improving the communication - wielding the blazing sword of conciseness - cutting to the heart of the matter - in seeking the best outcome of the written word: to inspire, to elevate an individual's ability to achieve and perform.

The title and subject matter of the book are still confidential...

Wednesday, September 22, 2021

2021-09-22 Wednesday - TensorFlow 2.0 In Action


(image source: Manning Publications, Co)

A big *Thank You* to Aleksandar Dragosavljević (Review Editor, Manning Publications Co.), for inviting me to be part of the manuscript review team for "TensorFlow 2.0 in Action", by Thushan Ganegedara.

The book is targeted for final publication in Spring 2022 (but you can read many of the chapters online now, as part of the Manning Early Access Program (MEAP)).

"TensorFlow 2.0 in Action teaches you to use the new features of TensorFlow 2.0 to create advanced deep learning models. You’ll learn by building hands-on projects, including an image classifier that can recognize objects, a French-to-English machine translator, and even a neural network that can write fiction."

Thushan Ganegedara is a data scientist with QBE. He holds a PhD in machine learning from the University of Sydney and he has worked with TensorFlow for almost 5 years.

2021-09-22 Wednesday - Microsoft Surface Laptop Studio Announced


2021-09-22 Microsoft Event Live 

Microsoft Surface Laptop Studio will be available Oct 5th



• 14.4” touchscreen
• Refresh rate: up to 120Hz
• Resolution: 2400 x 1600 (201 PPI)
• Aspect ratio: 3:2
• Contrast ratio: 1500:1
• Quad-core 11th Gen Intel® Core™ H Series processor, Windows 11, and up to 32GB RAM
    • Quad-core 11th Gen Intel® Core™ H35 i5-11300H
    • Quad-core 11th Gen Intel® Core™ H35 i7-11370H
• 16GB or 32GB LPDDR4x RAM
• NVIDIA’s 2nd gen RTX architecture
• Starting at 3.83 lbs
• Two USB 4.0 ports with Thunderbolt™ 4 support
• Dedicated charging port (1 x Surface Connect port)
• 3.5mm headphone jack
• Wi-Fi 6 (802.11ax compatible)
• Bluetooth® Wireless 5.1 technology
• Hardware TPM 2.0 chip for enterprise security and BitLocker support
• Enterprise-grade protection with Windows Hello face sign-in
• Windows 11 Home
• Preloaded Microsoft 365 Apps
• Microsoft 365 Home 30-day trial
• Ambient light sensor
• Removable solid-state drive (SSD) options: 256GB, 512GB, 1TB, 2TB
• Graphics:
    • Intel® Core™ i5 models: Intel® Iris® Xe Graphics
    • Intel® Core™ i7 models: NVIDIA® GeForce RTX™ 3050 Ti laptop GPU with 4GB GDDR6 GPU memory
• Battery life:
    • Intel® Core™ i5: up to 19 hours of typical device usage
    • Intel® Core™ i7: up to 18 hours of typical device usage
• Cameras, video, and audio:
    • 1080p front-facing camera
    • Dual far-field Studio Mics
    • Quad Omnisonic™ speakers with Dolby Atmos
• Casing: magnesium and aluminum



Monday, September 20, 2021

2021-09-20 Monday - ERP Subject-Matter Expert (SME) Resources

(Image by Radoan Tanvir from Pixabay)


My blog post in which I've begun organizing links to some of my background reading/research related to developing an SME level of expertise in a number of ERP-related domains - and some of the resources that might be of interest to others (e.g., Warehouse Management, Supply Chain Management, Logistics and Transportation, Inventory Management, Facilities Planning and Management, Operations Management).


This week I'm starting a deep dive into Oracle Cloud ERP (21C) documentation.


Oracle Fusion Cloud ERP recognized as a Leader in the 2021 Gartner Magic Quadrant for Cloud Core Financial Management Suites

  • Gartner Magic Quadrant for Cloud Core Financial Management Suites for Midsize, Large and Global Enterprises
  • What is ERP?
  • Oracle Cloud ERP leadership

Selected Oracle Cloud ERP 21C documentation links:

As part of my background reading and research - to expand my mental model and expertise as a Subject-Matter Expert (SME) in ERP - and the variously related domains - I'm also assembling a list of additional resources (books, articles, videos, podcasts, etc.).


YouTube Videos, re: Oracle ERP Cloud:

Channel: MIT

Channel: Oracle

Channel: Ora Trainings

General Article Links: 


General ERP and Related Domain Books:

(Caveat - I have not yet read these...still doing my discovery, categorizing, and stack-ranking...)
These are some of the books I think would be broadly of interest to others - and a few that I am particularly interested in reading. As I am often involved in mentoring folks, I want to include materials that would be appropriate for beginner/junior - as well as senior - team members.

Domain Focus: ERP
Domain Focus: Supply Chain Management
  • Supply Chain Management For Dummies
  • KM: I have found quite a few of the "...For Dummies" series of books to be excellent as a first step in developing an initial awareness of, and orientation to, a given domain.

Domain Focus: Logistics & Transportation
Domain Focus: Inventory Management 
Domain Focus: Operations Management


These are slightly older ERP-related books, but I think they may still have value as foundational / orienting material:



Friday, September 17, 2021

2021-09-17 Friday - Architecture as a Holistic Diagnostic Practice


(image source: geralt)

I spent a good portion of yesterday crafting a memo - focused on providing the leadership of a company with observations and suggestions on their business model. The feedback was very positive today.

Architecture - when done properly - encompasses so much more than just the technical/engineering aspects. Diagnosis (and treatment) is a holistic effort that considers the entire organization - across many dimensions.


Wednesday, September 15, 2021

2021-09-15 Wednesday - Book Review: Mastering Transformers



My post on LinkedIn, mentioning this review. 

My review posted to Amazon

Mastering Transformers: Build state-of-the-art models from scratch with advanced natural language processing techniques


Full Disclosure: Priyanka Mhatre (Digital Marketing Specialist, Packt) graciously invited me to review this book before its publication, and provided me with a PDF copy.

It covers all the important topics, from training BERT, GPT, and other Transformer models from scratch, to fine-tuning models on various tasks such as question answering, NER, classification, and zero-shot classification.

-        Explore state-of-the-art NLP solutions with the Transformers library

-        Train a language model in any language with any transformer architecture

-        Fine-tune a pre-trained language model to perform several downstream tasks

-        Select the right framework for the training, evaluation, and production of an end-to-end solution

-        Get hands-on experience in using TensorBoard and Weights & Biases

-        Visualize the internal representation of transformer models for interpretability


My one word summary for this book: Fascinating.

A few other key words that come to mind to describe this book: Foundational, Hands-on, Practical, Crisp, Concise, Depth & Breadth, Tremendous Value.

With the continued accelerating explosion in the growth of unstructured data collected by enterprises in texts and documents – the need to be able to analyze and derive meaningful information is more critical than ever – and will be the competitive advantage that distinguishes future winners from losers in the marketplace of solutions.   This book is an investment in expanding your awareness of the techniques and capabilities that will help you navigate those challenges.

From the book: 

Transformer models have gained immense interest because of their effectiveness in all NLP tasks, from text classification to text generation….[and] effectively improve the performance of multilingual and multi-task NLP problems, as well as monolingual and single tasks.

This book is a practical guide to leveraging (and applying) some of the leading-edge concepts, algorithms, and libraries from the fields of Deep Learning (DL) and Natural Language Processing (NLP) to solve real-world problems – ranging from summarization to  question-answering.

In particular, this book serves as a gentle guided tour of some of the important advances that have occurred (and continue to occur) as the field gradually evolved toward the attention-based encoder-decoder architecture of the Transformer.
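As a small illustration of that core idea (my own sketch, not code from the book), the scaled dot-product attention at the heart of the Transformer architecture can be written in a few lines of plain Python:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(queries, keys, values):
    """Toy scaled dot-product attention over lists of vectors.

    For each query: score every key with a dot product, scale by
    sqrt(d_k), softmax the scores into weights, and return the
    weighted sum of the value vectors.
    """
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs

# Example: one query attending over two key/value pairs. The output is a
# convex combination of the value vectors, weighted toward the key most
# similar to the query.
out = scaled_dot_product_attention(
    queries=[[1.0, 0.0]],
    keys=[[1.0, 0.0], [0.0, 1.0]],
    values=[[1.0, 2.0], [3.0, 4.0]],
)
```

Real implementations (as covered in the book) operate on batched tensors with learned projections and multiple heads, but the mechanism is the same.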

What I particularly liked:

The deep subject-matter experience and credentials of the authors (“Savaş Yıldırım graduated from the Istanbul Technical University Department of Computer Engineering and holds a Ph.D. degree in Natural Language Processing (NLP). Currently, he is an associate professor at the Istanbul Bilgi University, Turkey, and is a visiting researcher at the Ryerson University, Canada. He is a proactive lecturer and researcher with more than 20 years of experience teaching courses on machine learning, deep learning, and NLP.”,  and “Meysam Asgari-Chenaghlu is an AI manager at Carbon Consulting and is also a Ph.D. candidate at the University of Tabriz.”)

The companion “Code In Action” YouTube channel playlist for the book,  and the GitHub repository with code examples.

The excellent quality/conciseness/crispness of the writing.

The extensive citation of relevant research papers – and references at the end of chapters.

The authors’ deep practical knowledge – and discussions – of the advantages and disadvantages of different approaches.

The exquisitely balanced need for technical depth in the details covered by a given chapter – with the need to maintain a steady pace of educating & keeping the reader engaged. Some books go too deep, and some stay too shallow. This book is exceptionally well balanced at just the right depth.

The exceptional variety of examples covered.

The quality of the illustrations used to convey complex concepts – Figures 1.19, 3.2, 3.3, 7.8, 9.3 are  just a few examples of the many good diagrams.

Chapter-1’s focus on getting the reader immediately involved in executing a hello-world example with Transformers. The overview of RNNs, FFNNs, LSTMs, and CNNs. An excellent overview of the developments in NLP over the last 10 years that led to the Transformer architecture.

Chapter-2’s guidance on installing the required software – and the suggestion of Google Colab as an alternative to Anaconda.

Chapter-2’s coverage of community-provided models, benchmarks, TensorFlow, PyTorch, and Transformer - and running a simple Transformer from scratch.

Chapter-3’s coverage of BERT – as well as ALBERT, RoBERTa, and ELECTRA.

Chapter-4’s coverage of AR, GPT, BART, and NLG.

Chapter-5’s coverage of fine-tuning language models for text classification (e.g., for sentiment analysis, or multi-class classification).

Chapter-6’s coverage of NER and POS was of particular interest – given the effort that I had to expend last year doing my own deep-dive to prepare some recommendations for a client – I wish I had had this book then.

Chapter-7’s coverage of USE and SBERT, zero-shot learning with BART, and FLAIR.

Chapter-8’s discussion of efficient sparse transformers (Linformer and BigBird) – as well as the techniques of distillation, pruning, and quantization – to make efficient models out of trained models. Chapter-8 may well be worth the price of the book, itself.

Chapter-9’s coverage of multilingual and cross-lingual language model training (and pretraining). I found the discussion of “Cross-lingual similarity tasks” (see p-278) to be particularly interesting.

Chapter-10’s coverage of Locust for load testing, fastAPI, and TensorFlow Extended (TFX) – as well as the serving of solutions in environments where CPU/GPU is available.

Chapter-11’s coverage of visualization with exBERT and BertViz – as well as the discussion on tracking model training with TensorBoard and W&B.

The “Other Books You May Enjoy” section at the end of the book (“Getting Started with Google BERT”, and “Mastering spaCy”).

Suggestions for the next edition:

The fonts used for the text in some figures (e.g., 3.8, 3.10, 3.12, 3.13, 3.14, 4.5, 4.6, 6.2, 6.7, 8.4, 8.6, 9.4, 9.5) appear to be a bit fuzzy in the PDF version of the book. Compare those with the clarity of figure 6.6.


Table of Contents:

Section 1: Introduction – Recent Developments in the Field, Installations, and Hello World Applications

1 - From Bag-of-Words to the Transformer

2 - A Hands-On Introduction to the Subject

Section 2: Transformer Models – From Autoencoding to Autoregressive Models

3 - Autoencoding Language Models

4 - Autoregressive and Other Language Models

5 - Fine-Tuning Language Models for Text Classification

6 - Fine-Tuning Language Models for Token Classification

7 - Text Representation

Section 3: Advanced Topics

8 - Working with Efficient Transformers

9 - Cross-Lingual and Multilingual Language Modeling

10 - Serving Transformer Models

11 - Attention Visualization and Experiment Tracking



© 2001-2021 International Technology Ventures, Inc., All Rights Reserved.