Thoughts on “Confront Your Ignorance” apprenticeship pattern

This pattern seeks to solve the problem of skill gaps that are making daily work more challenging.  The authors propose that software apprentices suffering from this issue attempt to actively learn the missing pieces.  This could take different forms — they suggest reading tutorials/FAQs, constructing small low-stakes projects, and/or involving other people that are either experts in the area or trying to learn the same thing.  They also suggest keeping a list of these skill gap areas, and crossing them off as they’re sufficiently learned; this goes hand-in-hand with adding to the list as your learning exposes additional gaps.

This approach to learning really resonates with me.  My preference is to actively seek out knowledge, and I tend to learn best through hands-on practice.  I’ve already used a less formalized version of this when I taught myself Python: find a skill (in this case, a programming language) that I would like to learn and then give myself a project to work on that forces me to learn and use it.  There are three major additions (on top of the formalization) that I can take away from this pattern:

  1. Involve other people, whether they are experts or fellow learners.
  2. Don’t overuse this pattern to the point where it causes problems for others; I only have so much time.
  3. Balance learning with introspection.

The first point leads to the creation of a learning community, and extends both the resources and benefits of learning.  I know that I have a tendency to want to do everything myself, and while independence isn’t bad it’s also important to not always be reinventing the wheel.  I also enjoy sharing my knowledge, and it makes sense for me to seek that out in a more mutual way.

The second point is also something I run into often, and partially extends from the first.  I really like to build things from scratch and see how they’re made.  However, that tendency can also lead to excessive use of time and energy for what should be a simpler project, or the preference for my own solution over another (quite possibly better) one that’s already been written and vetted.

The third point encourages me to set up a cycle of learning and introspection; crossing items off of the list and then adding more to the bottom.

While actual checklists are not a tool I particularly enjoy using, this pattern has nudged me toward keeping one (and actually updating it).  That, I think, is where I’ve found the most value in Confront Your Ignorance.


Sprint 1 Retrospective

In this sprint I learned how to set up a development environment for a fairly unfamiliar IDE and language, and how to collaborate with other people to work through issues in building a local copy of the AMPATH app.  I struggled most in the latter with actually understanding the error messages and how to resolve them.  I can use the difficulty I experienced with this process as a reminder in the future to pick through the error messages more carefully and try to resolve the conflicts myself.  I was overly cautious about making changes to files or folders, even though the worst outcome would just be reverting to a previous version.

I think the team worked pretty well together on this sprint, although it’s a little early to tell.  We were all working on the same thing (environment/app setup) which didn’t offer a lot of room for collaboration other than helping each other resolve problems.  I think that we’ll get a better feel for our team dynamic in this upcoming sprint when we have more “real” work to do.

This sprint was primarily focused on getting the AMPATH app up and running on my local system.  I started by forking the repository from the group repo, then cloning it to my system.  I then looked through the README and tried to follow the instructions.  I must have done something incorrectly or in the wrong order, because I ran into missing dependencies (which should have been downloaded by npm).  I then tried a variety of solutions to resolve the problems.  I attempted to install the dependencies directly (didn’t work, due to a file/folder creation error).  I deleted and checked out the project again (didn’t work either, same issues as before).  I tried rolling back to older versions of Angular and npm, which a couple of StackOverflow pages relating to similar errors suggested (didn’t work, broke my Angular install, and I had to reinstall).  On the second or third try of just installing the dependencies (with a clean Angular install and project version), I managed to resolve the issue, but then ran into a problem with the ladda module — it wasn’t in the right place for Angular to find it.  I was stuck on that until Matt Foley found and posted a solution.

Once I had the solution to the ladda error, I reinstalled Angular again (just to be sure), worked in order through the steps outlined in the README, worked through Matt Foley’s fix, and wrote up the procedure I took as I went so that other people in my group and in the class could use it.

The procedure boiled down to the following:

  1. Run npm install, then install the global dependencies, then run npm install again to catch anything that might have been missed.
  2. Create a copy of one of the ladda files in the directory Angular wanted to look for it in, and modify one of the ladda files to point to the proper directory.
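As a rough sketch, the procedure looked something like the commands below.  The global package name and the ladda file locations are assumptions on my part here (the write-up I shared with the group has the exact steps and paths), so treat this as a reminder of the shape of the fix, not a copy-paste recipe:

```shell
# Sketch of the setup procedure (package name and paths are assumptions;
# see the write-up shared with the group for the exact steps).
npm install                  # first pass at the local dependencies
npm install -g @angular/cli  # global dependencies (Angular CLI assumed here)
npm install                  # second pass to catch anything missed earlier
# Then the ladda fix: copy the ladda file into the directory Angular
# actually searches, and edit the module reference to point at the proper
# directory.  The exact paths depend on the project layout, so they are
# left out here.
```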


Thoughts on “The White Belt” apprenticeship pattern

The White Belt pattern arises out of an issue I’ve encountered — I’ve learned one language fairly well and have some practice with others that follow similar paradigms, but have found it to be more challenging to learn new things.  Tools, skills, and languages that are different from what I know don’t come as naturally as it seems they should.

This apprenticeship pattern seeks to solve that problem through both a mindset and a more practical approach.  It teaches a willingness to feel like a beginner again, to fail and maybe look foolish, in order to recover a childlike ability to absorb knowledge.  More practically, it suggests a learning strategy: adapt a simple, known software project from one language to another, using the paradigms of the new language to their fullest rather than trying to write “Fortran in any language”.

As I said before, I have noticed that I struggle more and more to acquire new skills.  Whether that’s environment setup, picking up new languages, or adapting to a different set of tools, it seems to get harder as I learn more.  That doesn’t bode well for my mental plasticity, and this pattern provides a useful solution.

The most useful aspect of the pattern, for me, is what the authors call the mindset of not knowing.  A willingness to ask questions and to start from the beginning (whether that’s following tutorials that feel beneath me, or finding help elsewhere).  A willingness to take the harder road, learn things right, and come up with a better and more nuanced solution rather than patching together something familiar and comforting.  This speaks to something I learned in a high school gym class: we learn and grow best just a bit outside our comfort zones.  I need to push myself into the “learning zone”.

The actionable advice is also something that I can take away and use moving forward.  While I don’t have a lot of time now, I am interested in refactoring old projects into new languages as a learning exercise, blending something that I know well already with something brand-new.


Thoughts on “Apprenticeship Patterns, Chapter 1”

In this blog post, I want to go through Hoover and Oshineye’s values of software craftsmanship, and how I can apply them to my work and my development as a developer.  To summarize quickly, they are:

  • Talent derives from effort and practice.
  • Adapt and change.
  • Embrace pragmatism.
  • Share knowledge.
  • Experiment, and be willing to fail.
  • Take control of your own destiny.
  • Focus on individuals.
  • Include the best ideas from everyone.
  • Focus on skills over processes.
  • Find like-minded people to learn with and from.

I don’t necessarily agree that all talent comes from practice and effort, but practice and effort are the cornerstones of ability; talent certainly languishes without use.  I also don’t think that it’s enough to just do my work for classes or my job.  Improvement requires practice on new things and dedication of time outside of the minimum needed to put food on the table.

Like any machine learning system, we can improve via feedback.  Unfortunately, it’s not always so easy as minimizing a loss function and backpropagating information.  I agree with the authors wholeheartedly on the need to find solutions to my inadequacies.

Pragmatism is great, but I don’t agree that it’s the be-all and end-all; I do appreciate, however, that it’s more important in the context of apprenticeship than it might be later in my life.  I recognize the tendency within myself to become paralyzed by attempts to force “theoretical purity” or “future perfection” from the start, and then never start at all.  This is something I very much want to change, or at least to recognize when that tendency is useful versus when it gets in the way.

One of the things I enjoy most about being a part of a learning community is the ability to share what I know, and learn from my peers.  I will miss that after I graduate, so I may try to bring that kind of sharing to wherever I end up next.

I absolutely agree that progress in any form requires experimentation and failure.  From my experience playing various games, I know that it’s important to take something away from each loss, and also to acknowledge the room for improvement in each win.

I think that many of Hoover’s and Oshineye’s points follow from taking control of one’s destiny.  I agree that conscious choice and involvement in learning and life are key to the improvement of self.

I like the idea of a community of like-minded individuals rather than a blind reliance on hierarchy.  In my gaming life, I’ve found that (sometimes somewhat heated) debate is an excellent way to grow myself and to learn.

Inclusiveness goes hand-in-hand with pragmatism.  If an idea is useful or productive, adopt it.  If a person has value to contribute or simply wishes to share in the learning and information, include them.

My earlier education heavily emphasized skills over memorization, and recognized that different processes may work better or worse for different individuals.  I agree that it’s important to recognize these things, as well as to acknowledge that skill is not always equally distributed.

Learning is communal, and this last value of “situated learning” seems to follow naturally from many of the earlier ones.  I want to continue to align myself with like-minded individuals, even if that requires more effort than just showing up to class.

There is more to unpack in this chapter than what I have covered here, far more than what I could digest and boil down for a single blog post.  I believe that commitment to the values Hoover and Oshineye lay out will help me improve as a software craftsman.


Introduction for CS 448

This is the blog I will be using for this course.


Thoughts on “Hybrid Verification: Mixing Formal Methods and Testing”

This article, by Dr. Ben Brosgol, focuses on a mixture of formal methods and testing practices (together called “hybrid verification”) and the use of “contracts” that consist of preconditions and postconditions in order to formalize the assumptions made by critical code.

I chose to write about this article because it highlights some of the limits of testing, shows how to provide additional security for critical code, and introduces contract-based programming.

Brosgol defines a “contract” in programming as a set of preconditions and postconditions that wrap a subprogram (function, method, etc.) call.  The subprogram cannot begin unless its preconditions are met, and cannot return until its postconditions are true.  This provides a contract between a program and its subprograms that guarantees a certain state at critical times.  There are tools written for some languages (he uses SPARK as an example) that can do both static and dynamic contract checking and provide proof that the code will work as specified.
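To make the idea concrete for myself, here is a minimal sketch of a runtime-checked contract in Python.  This is my own illustration, not from the article (Brosgol’s examples use SPARK, where contracts can also be proven statically); the decorator and function names are hypothetical:

```python
def with_contract(pre, post):
    """Decorator enforcing a precondition before the call and a
    postcondition on the result, checked dynamically at runtime."""
    def wrap(func):
        def inner(*args):
            assert pre(*args), "precondition violated"
            result = func(*args)
            assert post(result, *args), "postcondition violated"
            return result
        return inner
    return wrap

@with_contract(pre=lambda x: x >= 0,
               post=lambda r, x: abs(r * r - x) < 1e-6)
def sqrt_newton(x):
    """Square root via Newton's method; the contract makes its
    assumptions (non-negative input, accurate result) explicit."""
    guess = x or 1.0
    for _ in range(100):
        guess = 0.5 * (guess + x / guess)
    return guess
```

Calling `sqrt_newton(-1.0)` fails the precondition immediately, rather than silently producing garbage deep inside the computation.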

Brosgol then details ways to mix testing and formal verification.  If formally verified code calls subprograms that were tested rather than proven, the formal analysis tools will attempt to show that the preconditions are met, and assume that the tested code satisfies the postconditions.  This also requires that either the contracts are checked at runtime, or that sufficient testing was done such that the developer is confident the contracts will be fulfilled by the tested code.  If formally verified code is called from tested code, there need to be runtime checks for the preconditions (because the tested code does not guarantee those in the way the formal verification requires), but because the postconditions have been proven there is no need for checks at that point.
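The boundary rules above can be sketched in a few lines.  Again this is my own Python illustration with hypothetical names, standing in for the SPARK tooling the article actually discusses; the point is only which checks happen where:

```python
def proven_divide(a, b):
    """Stands in for a formally verified subprogram: its postcondition
    needs no runtime check (the proof guarantees it), but tested
    callers do not establish its precondition (b != 0)."""
    return a / b

def call_proven_from_tested(a, b):
    # Tested code calling proven code: check the precondition at
    # runtime, because testing alone does not establish it...
    assert b != 0, "precondition of proven_divide not established"
    # ...but skip any postcondition check on the result, since the
    # formal proof already guarantees it holds.
    return proven_divide(a, b)
```

The mirror-image case (proven code calling tested code) would instead have the analysis tools discharge the precondition and trust, via testing or runtime checks, that the tested callee delivers its postcondition.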

Next, Brosgol mentions the need to choose postconditions carefully.  Strong, extensive postconditions make proofs easier, but may impose unacceptable overhead if they need to be checked dynamically.

He concludes that the relatively new combination of formal proof tools and contract verification on both a static and a dynamic basis opens up new avenues to create code that secures its critical sections.

This article helped me to understand that there’s a much wider world of testing beyond what we covered in class.  We didn’t talk about proof-based testing at all, and that’s a subject area that I believe I should learn more about.  It also highlights the way that our understanding of testing is ever-expanding.


Thoughts on “Getting Started with AI for Testing”

In my last post, I wrote about an article that dove into the uses of AI in software testing.  Given the volume of search engine results that turned up when I started doing some research into the subject area, I thought it was worthwhile to write another piece about it.

The post I chose to write about this time is an introduction to AIST – Artificial Intelligence for Software Testing.  It is defined by Tariq King (the author of the post) as “an emerging field aimed at the development of AI systems to test software, methods to test AI systems, and ultimately designing software that is capable of self-testing and self-healing.”  Most intriguing to me is the last part — self-healing software.

The organization hosting this blog (of which King is a founding member) is called AISTA, or the Artificial Intelligence for Software Testing Association.  Their mission is to pursue what they call the “Grand Dream” of testing: software that tests and updates itself with little need for human intervention.

King’s post is more of a survey than an in-depth piece.  He identifies three areas to explore when looking to get into AIST: artificial intelligence, software testing, and self-managing systems.  I know a little about the first two, but the third I haven’t touched on much.  Self-managing systems also appear to be the focus of AISTA.  King claims that there is “a general lack of research in the area of self-testable autonomic software”, but that recent technological developments appear to bring solutions closer to practicality.

Ultimately, self-managing and self-healing systems are designed to adapt to their environment, modeled (originally by IBM) after the autonomic nervous system in living creatures.  A self-healing system should be able to maintain homeostasis alongside self-optimization.  And that necessitates self-testing: before making changes to its own code, an autonomous system needs to ensure the change won’t do more harm than good.

So, what does a world of self-testing software mean for software testers?  It means that we may become more like teachers for software systems, moving them out of local pitfalls so that they can continue to grow.  Of course, these systems may be a long way off, and will need extensive human-driven testing and validation before they can start to test themselves.

The robots aren’t coming to take software testing jobs.  Yet.
