2018-07-30 Monday - Open Offices - Bad for High Performance Employees
- "Oxford Economics conducted a survey of more than 600 executives and 600 employees to better understand what works for employees—and what doesn’t—about open-plan layouts,"
- "... results show that threats to productivity and worker peace of mind are bigger issues than most executives realize..."
2018-07-28 Saturday - A Gentle Rebuke to Martin Fowler, re: Software Architect
After watching this brief talk given in 2015 by Martin Fowler at the O'Reilly OSCON conference, I have a gentle rebuke:
First, let me say that I greatly admire Martin Fowler - and have told him so, in person, during a QCon in San Francisco some years ago - and that he is a great inspiration to my own personal growth in this field.
But I do have a few questions - which I think are worth considering as a counter-point to the views expressed in this talk, which does not, I think, quite adequately consider the matter:
The implicit view - that architects who do not code (as their primary activity, or no longer at all) are of no value, and that the code is the essential truth of 'The Architecture' - seems to me to ignore the following concerns:
1. While AS-IS may very well be expressed in code (to a great extent) - what of the TO-BE, which may well need to be charted and communicated over a multi-year arc of intent, effort, and funding by the business - in coordination with dozens, hundreds, or thousands of third-party partners?
2. What of communicating an understanding of those aspects of the design, which may need to be shared and communicated with third-parties - such as in the case of major (and complex) integration efforts? Will you just give those (potential competitors) the code?
3. What of communicating an understanding of those aspects of the overall system - some of which may not be implemented in your code - and which may rely upon other systems for which you do not have access to the source code (or which may be hosted by Cloud SaaS providers)?
4. What of communicating an understanding of those aspects of the overall system - some of which may not be implemented in code - and which may be implemented as manual processes - but are still critical elements to understanding the true scope of the overall system?
As Randy Shoup notes in his talk "Moving Fast at Scale" - "Sometimes we solve those problems with code" [emphasis mine] - which raises the question, Martin: how do you document those things that are not in code, if not with diagrams?
5. What of the oversight and governance functions that should be tended to - as systems must be managed through a life-cycle - which must include a clear understanding of the architectural implications when sun-setting, replacement, or rationalization choices must be considered, researched, choices evaluated, recommendations prepared, etc. - are architects who spend their time coding - and not keeping a weather eye on these concerns - really making the best use of their time for the business they are serving?
6. Whether an architect codes (as their primary activity, or no longer at all) - and whether there is value in an architect who no longer codes - seems to me to be primarily a question of scale. In a multi-billion dollar business - in which an architect may be responsible for the oversight of dozens of applications, within one or more given business lines (of which there may well be dozens) - they are typically required to closely monitor, align, and coordinate the architectural road maps that span multiple business lines - usually involving multiple organizational units and external third-party partners, and impacting potentially hundreds of applications - over multi-year staged delivery planning of capabilities. Is it really feasible (or optimal for the business) to have someone in such an architect role buried in the minutiae of coding?
7. Does the architect of a major commercial building, a skyscraper, a new aircraft, a bridge, or a new automobile design spend their time pouring foundations, laying bricks, running wiring through conduit, riveting sheet metal on an assembly line, etc.? Would any professional architect worth their salt sneer at the very thought of being expected to document the design of such endeavors? Or would they simply say, pop the hood and disassemble the engine if you want to understand how it works? What of the systems that approach (or exceed) 1M+ lines of code - and which may exist in an ecosystem of a multitude of other systems that are of similar size and complexity? Is it feasible to require people to just read the code to understand one of those systems - or the potentially intricate interactions between them? Does your advice of 'just a few diagrams' still hold?
For these reasons, we should be open to embracing - across a continuum of diversity of possible manifestations - the possible interpretations of the role of Software Architect (or even, Enterprise Architect) - and not be disparaging - and certainly not promote the use of such derisive terms as 'Architecture Astronaut'.
Meeks' Software Architecture Conjecture #1:
The source code may (or may not) be a full implementation of the desired capability needed by the business - but is more likely just an approximation (constrained by permitted time, allocated budget, and available skills/talent of the team involved). Therefore, it should not be confused with the actual or desired (or envisioned) design - that may require multiple years to achieve - of which the current source code may only reflect a partial (and incomplete, or inaccurate) representation.
2018-07-26 Thursday - Resources for ADA Compliance Initiatives
I happened across this government 'Americans with Disabilities Act' (ADA) web site this evening - if you have an ADA initiative planned, this might be of interest to you:
ADA Best Practices Tool Kit for State and Local Governments
https://www.ada.gov/pcatoolkit/toolkitmain.htm
I also keep a list of additional resource links:
https://github.com/intltechventures/Lab.Architecture/tree/master/ADA
2018-07-24 Tuesday - Oracle JDK 11 Production Usage
https://dzone.com/articles/eliminating-java-update-confusion
"In a significant move by Oracle they have recently announced that, from JDK 11, the Oracle JDK binary will no longer be free for use in production. Developers will still be able to download Oracle JDK binaries and use them for development, testing and demonstrations without change. For use in a production environment, a support contract with Oracle will be required."
See pricing info in this FAQ:
http://www.oracle.com/technetwork/java/javaseproducts/overview/javasesubscriptionfaq-4891443.html
2018-07-15 Sunday - Orchestration vs Choreography
I've often been lax in my choice of using the terms Orchestration or Choreography.
This evening I found a useful explanation of the distinction between the two, in this Wikipedia article on BPEL:
https://en.wikipedia.org/wiki/Business_Process_Execution_Language
"BPEL is an orchestration language, and not a choreography language. The primary difference between orchestration and choreography is executability and control. An orchestration specifies an executable process that involves message exchanges with other systems, such that the message exchange sequences are controlled by the orchestration designer. A choreography specifies a protocol for peer-to-peer interactions, defining, e.g., the legal sequences of messages exchanged with the purpose of guaranteeing interoperability. Such a protocol is not directly executable, as it allows many different realizations (processes that comply with it). A choreography can be realized by writing an orchestration (e.g., in the form of a BPEL process) for each peer involved in it. The orchestration and the choreography distinctions are based on analogies: orchestration refers to the central control (by the conductor) of the behavior of a distributed system (the orchestra consisting of many players), while choreography refers to a distributed system (the dancing team) which operates according to rules (the choreography) but without centralized control."
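The distinction is easy to see in miniature. Here is a toy sketch of my own (not from the article, and with purely illustrative service names) contrasting the two styles: an orchestrator invokes each step itself and controls the sequence, while choreographed services simply react to events, with no central controller:

```python
# Orchestration: one central process controls the sequence of calls.
def reserve(order): return f"reserved:{order}"
def charge(order):  return f"charged:{order}"
def ship(order):    return f"shipped:{order}"

def orchestrator(order):
    # The orchestrator decides the order of steps and sees every result.
    return [reserve(order), charge(order), ship(order)]

# Choreography: each service subscribes to events and emits new ones;
# the end-to-end flow emerges from the rules, without central control.
handlers = {}   # event name -> list of subscribed handlers

def subscribe(event, handler):
    handlers.setdefault(event, []).append(handler)

def publish(event, payload, log):
    log.append(f"{event}:{payload}")
    for h in handlers.get(event, []):
        h(payload, log)

subscribe("order_placed",   lambda o, log: publish("stock_reserved", o, log))
subscribe("stock_reserved", lambda o, log: publish("payment_taken", o, log))
subscribe("payment_taken",  lambda o, log: publish("order_shipped", o, log))

log = []
publish("order_placed", "A1", log)
# log now traces the same business flow, driven purely by events
```

Note that in the choreographed version no single piece of code states the overall sequence - which is precisely why such a protocol is "not directly executable" in the BPEL sense.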
2018-07-07 Saturday - CPU vs GPU for Machine Learning Performance
https://www.nextplatform.com/2017/10/13/new-optimizations-improve-deep-learning-frameworks-cpus/
"Intel has been reported to claim that processing in BigDL is “orders of magnitude faster than out-of-box open source Caffe, Torch, or TensorFlow on a single-node Xeon processor (i.e., comparable with mainstream GPU).”"
2017-08-09
TensorFlow* Optimizations on Modern Intel® Architecture
https://software.intel.com/en-us/articles/tensorflow-optimizations-on-modern-intel-architecture
"TensorFlow benchmarks, with CPU optimizations added, see CPU performance gain as much as 72X"
A paper presented during the 2017 International Conference on Machine Learning (ICML)
- Deep Tensor Convolution on Multicores
- https://arxiv.org/abs/1611.06565
- "...Another important reason to look at CPUs is when batch size is 1, as may be the case in Reinforcement Learning, where it is not worthwhile to move data between CPU and GPU."
- "Deep convolutional neural networks (ConvNets) of 3-dimensional kernels allow joint modeling of spatiotemporal features. These networks have improved performance of video and volumetric image analysis, but have been limited in size due to the low memory ceiling of GPU hardware. Existing CPU implementations overcome this constraint but are impractically slow. Here we extend and optimize the faster Winograd-class of convolutional algorithms to the N-dimensional case and specifically for CPU hardware. First, we remove the need to manually hand-craft algorithms by exploiting the relaxed constraints and cheap sparse access of CPU memory. Second, we maximize CPU utilization and multicore scalability by transforming data matrices to be cache-aware, integer multiples of AVX vector widths. Treating 2-dimensional ConvNets as a special (and the least beneficial) case of our approach, we demonstrate a 5 to 25-fold improvement in throughput compared to previous state-of-the-art."
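The Winograd-class algorithms the authors extend can be illustrated in miniature. The following is my own sketch (not code from the paper) of the classic 1-D F(2,3) transform, which computes two outputs of a 3-tap filter using 4 multiplications instead of the 6 a direct sliding correlation needs:

```python
import numpy as np

# Winograd F(2,3): two outputs of a 3-tap correlation with 4 multiplies.
# These are the standard F(2,3) transform matrices.
BT = np.array([[1,  0, -1,  0],
               [0,  1,  1,  0],
               [0, -1,  1,  0],
               [0,  1,  0, -1]], dtype=float)
G = np.array([[1.0,  0.0, 0.0],
              [0.5,  0.5, 0.5],
              [0.5, -0.5, 0.5],
              [0.0,  0.0, 1.0]])
AT = np.array([[1, 1,  1,  0],
               [0, 1, -1, -1]], dtype=float)

def winograd_f23(d, g):
    """d: 4 input samples, g: 3 filter taps -> 2 output samples."""
    return AT @ ((G @ g) * (BT @ d))   # the 4 elementwise multiplies

d = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([1.0, 1.0, 1.0])
direct = np.array([d[0:3] @ g, d[1:4] @ g])  # naive sliding correlation
# winograd_f23(d, g) and direct both give [6., 9.]
```

The paper's contribution is generalizing this family to N dimensions and laying out the transformed tiles to match cache lines and AVX vector widths; this sketch only shows the arithmetic identity at the core.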
2018-07-07 Saturday - Researching BigDL Machine Learning Framework
Preparing my reading list to do a deep dive in BigDL:
References:
- https://github.com/intel-analytics/BigDL
- "BigDL is a distributed deep learning library for Apache Spark; with BigDL, users can write their deep learning applications as standard Spark programs, which can directly run on top of existing Spark or Hadoop clusters. To make it easy to build Spark and BigDL applications, a high level Analytics Zoo is provided for end-to-end analytics + AI pipelines"
- https://bigdl-project.github.io/0.6.0/#getting-started/
- https://github.com/intel-analytics/BigDL-tutorials
- Step-by-step Deep Leaning Tutorials on Apache Spark using BigDL
Talks:
Articles:
- https://www.infoq.com/news/2017/01/bigdl-deep-learning-on-spark
- Intel Open-Sources BigDL, Distributed Deep Learning Library for Apache Spark
- https://dzone.com/articles/deep-learning-with-intels-bigdl-and-apache-spark
- Deep Learning With Intel's BigDL and Apache Spark
- Intel
- https://software.intel.com/en-us/articles/bigdl-distributed-deep-learning-on-apache-spark
- BigDL: Distributed Deep Learning on Apache Spark
- AWS
- https://aws.amazon.com/blogs/machine-learning/running-bigdl-deep-learning-for-apache-spark-on-aws/
- Running BigDL, Deep Learning for Apache Spark, on AWS
- https://www.slideshare.net/AmazonWebServices/bigdl-image-recognition-using-apache-spark-with-bigdl-mcl358-reinvent-2017
- BigDL: Image Recognition Using Apache Spark with BigDL - MCL358 - re:Invent 2017
- https://cloud.google.com/blog/big-data/2018/04/using-bigdl-for-deep-learning-with-apache-spark-and-google-cloud-dataproc
- Using BigDL for deep learning with Apache Spark and Google Cloud Dataproc
- IBM
- https://medium.com/ibm-data-science-experience/using-bigdl-in-data-science-experience-for-deep-learning-on-spark-f1cf30ad6ca0
- Using BigDL in Data Science Experience for Deep Learning on Spark
- Lightbend
- https://developer.lightbend.com/blog/2017-06-22-bigdl-on-mesos/index.html
- Using Apache Spark with Intel BigDL on Mesosphere DC/OS
- "BigDL is a distributed deep learning library from Intel released and open-sourced in 2016. Besides offering most of the popular neural net topologies out of the box, BigDL boasts of extremely high performance through its usage of Intel MKL library for numerical computation. BigDL supports import/export of networks pre-trained in TensorFlow, Caffe or Torch and there are plans to include interoperability with other libraries in the market. However, the most important feature that made us look into BigDL is its native integration with Apache Spark that enables users to leverage the distributed processing capabilities of Spark infrastructure for training and model serving. For enterprises already using Spark, BigDL works just like any other Spark library that supplements the machine learning capabilities offered by Spark ML."
- "Lightbend Fast Data Platform (FDP) uses Mesosphere DC/OS as the cluster manager. FDP aims to make it easier for customers to productize the use of BigDL through its offering by integrating Spark as part of the platform. Machine learning applications developed using BigDL and Spark can also take advantage of the best-in-class streaming engines, the Lightbend Reactive Platform and messaging technologies like Kafka that form the complete suite of FDP."
- https://www.lightbend.com/products/fast-data-platform
- Microsoft
- https://blogs.technet.microsoft.com/machinelearning/2017/06/20/running-bigdl-apache-spark-deep-learning-library-on-microsoft-data-science-virtual-machine/
- Running BigDL Apache Spark Deep Learning Library on Microsoft Data Science Virtual Machine
- Cloudera
- https://blog.cloudera.com/blog/2017/09/deep-learning-with-intels-bigdl-and-apache-spark/
- Deep Learning with Intel’s BigDL and Apache Spark
- Parallel Universe Magazine
- https://software.intel.com/en-us/download/parallel-universe-magazine-issue-28-april-2017
- https://software.intel.com/sites/default/files/managed/31/c3/parallel-universe-issue-28.pdf
- BigDL: Optimized Deep Learning on Apache Spark*, by Jason Dai and Radhika Rangarajan, Page-57
- https://www.bluedata.com/blog/2017/09/deep-learning-with-bigdl-and-apache-spark-on-docker/
- Deep Learning with BigDL and Apache Spark on Docker
- https://rise.cs.berkeley.edu/blog/accelerating-deep-learning-training-with-bigdl-and-drizzle-on-apache-spark/
- Accelerating Deep Learning Training with BigDL and Drizzle on Apache Spark
- https://software.intel.com/en-us/articles/building-large-scale-image-feature-extraction-with-bigdl-at-jdcom
- "experience and lessons learned from Intel and JD teams in building a large-scale image feature extraction framework using deep learning on Apache Spark* and BigDL"
2018-07-07 Saturday - fastText for Text Classification
I'm doing some focused reading this weekend to investigate the relative performance of Machine Learning frameworks leveraging GPU vs CPU implementations - and whether there are cases in which a distributed CPU approach may have an advantage over a GPU approach.
This 2016 paper (using fastText, for text classification problems) by a Facebook AI Research (FAIR) team (Armand Joulin, Edouard Grave, Piotr Bojanowski, Tomas Mikolov) achieved some startling results that may be of interest to others.
Bag of Tricks for Efficient Text Classification
https://arxiv.org/abs/1607.01759
"This paper explores a simple and efficient baseline for text classification. Our experiments show that our fast text classifier fastText is often on par with deep learning classifiers in terms of accuracy, and many orders of magnitude faster for training and evaluation. We can train fastText on more than one billion words in less than ten minutes using a standard multicore CPU, and classify half a million sentences among 312K classes in less than a minute."
https://fasttext.cc/
Implementing Deep Learning Methods and Feature Engineering for Text Data: FastText
https://www.kdnuggets.com/2018/05/implementing-deep-learning-methods-feature-engineering-text-data-fasttext.html
"FastText is an open-source, free, lightweight library that allows users to learn text representations and text classifiers. It works on standard, generic hardware. Models can later be reduced in size to even fit on mobile devices."
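The core idea is simple enough to sketch. Below is a toy illustration of my own - not fastText's actual implementation, and with arbitrary dimensions and invented data - of the paper's recipe: represent a document as the average of its word vectors (here random vectors reached via a hashing trick), then train a simple linear classifier on top:

```python
import zlib
import numpy as np

# Toy fastText-style classifier: a document is the mean of its word
# vectors, fed to a linear classifier. The hashing trick stands in for
# a learned vocabulary; in real fastText the vectors are learned too.
DIM, BUCKETS = 16, 1000
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(BUCKETS, DIM))  # stand-in "word vectors"

def doc_vector(text):
    idx = [zlib.crc32(w.encode()) % BUCKETS for w in text.lower().split()]
    return embeddings[idx].mean(axis=0)

# Tiny invented corpus: 1 = positive, 0 = negative.
docs = ["great fast accurate", "loved this simple model",
        "slow and inaccurate", "terrible results hated it"]
labels = [1, 1, 0, 0]

X = [doc_vector(d) for d in docs]
w = np.zeros(DIM)
for _ in range(200):                  # perceptron-style updates
    for x, y in zip(X, labels):
        pred = 1 if x @ w > 0 else 0
        w += (y - pred) * x           # nudge w toward misclassified docs

def predict(text):
    return 1 if doc_vector(text) @ w > 0 else 0
```

Because everything reduces to averaging and a linear model, training is a handful of vector operations per document - which is the intuition behind the paper's billion-words-in-ten-minutes-on-a-CPU result.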
2018-07-07 Saturday - Exploring Scala.js
https://www.scala-js.org/
https://www.scala-js.org/tutorial/basic/
Writing client-side web applications in Scala
http://www.lihaoyi.com/hands-on-scala-js/
2018-07-01 Sunday - Weekend Research and Experimentation
Spent quite a bit of time this weekend experimenting with the latest release of Docker (18.05.0-ce).
Upgraded to the latest version of Vagrant (2.1.2) - will spend some time in the coming week diving back into it.
I've also spent some time reviewing the textbook that Charles Betz helped author, which he uses to teach a Master's-level course in IT Delivery:
- http://pubs.opengroup.org/opengrouppress/managing-digital/
- Initial impressions: Well written, balanced.
Another programming language to add to my backlog of innovative ideas to explore: The Dark Programming Language.
An interesting quote, via Twitter [re: Why Quantum Computing matters]:
"Modeling the caffeine molecule requires around 2^48 bits, which is around 1-10% of the # of atoms in the earth, but only 160 qbits" - @marco_pistoia (Master Inventor at IBM's T. J. Watson Research Center) at #FOSD #QuantumComputing
These concepts are worth adding to your search filters for news about emerging trends in software development innovation:
- "neural program synthesis"
- "machine learning + code"
Copyright
© 2001-2021 International Technology Ventures, Inc., All Rights Reserved.