Developers in New York City by Zip Code
In 2016, after reading a Dice Insights article, I downloaded data with counts of technology professionals by zip code, along with density. A recent NY Times article How Big Tech Is Turning New York Into a Silicon Valley Rival
prompted me to resurrect the data, decorate it a bit with neighborhood names, and then import it into Google Maps
, which was surprisingly easy.
The original data is available in Excel
Hofstede Dimensions and Coffee Consumption
Responding to a Treehugger article, Why Americans will never love tea as much as coffee
, I initially wrote my personal preferences for tea and coffee
, ending with: "BTW, this has just given me an idea for comparing Hofstede's cultural dimensions with coffee and tea consumption."
Afterward, I did some analysis in Excel, then ran the same processes in R using Visual Studio, then converted that to a Jupyter Notebook on Microsoft's Azure Notebooks
Although this analysis is limited to the 45 countries that have Hofstede's Cultural Dimensions scores, as well as per capita consumption figures for both coffee and tea, it would seem that coffee consumption correlates with power distance, individuality, and masculinity. Tea had small correlations with the dimensions, sometimes in the same direction as coffee. A fuller analysis is available on Microsoft's Azure Notebook
, but some quick findings:
Higher power distance, lower coffee consumption: -.63
Higher individuality, higher coffee consumption: .61
Higher masculinity, lower coffee consumption: -.41
Limiting the analysis to 21 highly developed OECD countries, excluding Japan and Korea to reduce some effects of culture and economics, only the masculinity dimension retains its high inverse correlation with coffee consumption. Exploring another dimension within this set, the degree of Protestantism has an equally strong positive correlation, and the two dimensions are themselves strongly inversely correlated.
Higher masculinity, lower coffee consumption: -.60, p-value=.003831
Higher Protestantism, higher coffee consumption: .61, p-value=.003642
Higher Protestantism, lower masculinity: -.60, p-value=.003895
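These figures are simple Pearson correlations. As a rough illustration of the computation, here is a pure-Python sketch; the numbers are made up for demonstration and are not the actual Hofstede and consumption data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up illustrative values, not the real dataset:
# higher power distance tending to pair with lower coffee consumption
power_distance = [11, 18, 35, 40, 57, 68, 80, 104]
coffee_kg = [12.0, 9.9, 8.2, 7.8, 5.5, 4.5, 4.0, 3.1]

print(round(pearson_r(power_distance, coffee_kg), 2))
```

For the p-values reported above, a library routine such as `scipy.stats.pearsonr` returns both the coefficient and its significance in one call.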
In general, one might say that collectivist, hierarchical, gender-traditional countries drink less coffee, but that might have more to do with history than an actual relationship, in that the older empires are more traditional, while later-developed countries are both highly Protestant and stronger consumers of coffee, the latter aspect having to do with trade. As for individuals, I don't know if it means much, as a real analysis covering many people might yield insights on personality rather than culture.
James Igoe's Reviews > Weak Links
by Peter Csermely
My rating: 4 of 5 stars
Many Great Ideas, A Bit Loose
Most of this book is rich with ideas on networks, often highly theoretical, but the presentation and possibilities at times hit a wrong or offensive chord. You may feel similarly, but
I would still heartily recommend this book to any lay reader interested in systems and networks.
View all my reviews
VBA versus .NET
I was recently messaged by someone on LinkedIn, and since my response seemed full enough, I thought I'd share.
I see that you also program in VBA but you have made the jump to .NET. Unfortunately, I have found C#/Excel coding to be quite slow and just wanted to hear about your experiences.
Slow? It depends on what you mean. Honestly, I have had to make the pitch when building apps that they should be in .NET rather than VBA for speed. One particular app had a form that needed to fill about 20 dropdowns on load, so using async operations was essential. That same app, while executing one SQL statement in the foreground, also executed two background statements that filled panels. It wouldn't have performed well if done in VBA.
If you mean that it takes longer, then yes, but that is a necessity for good code anyway. If you only need a local operation, non-threaded, that doesn't need to be used across the enterprise, VBA can make sense, but with .NET come numerous benefits that save time and energy. As an example, in a C# solution, the design will likely be model-view-presenter (MVP), with a class project, a unit test project, and an Excel UI view project. This can take longer, but there is the benefit of separation of concerns (SoC), and with SoC you can add unit testing. The .NET solution will also be publishable as a desktop application that does not require admin rights, with automatic updates across the enterprise, and integrated with source control, in my case TFS/VSTS (now Azure DevOps). Also, if you build libraries for reuse, coding C# becomes quicker as you can reuse components, with the benefit of not copy-pasting code around. .NET benefits can be huge if you use them.
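To make the MVP separation concrete, here is a minimal sketch; it is illustrative Python rather than the actual C#/Excel solution, and every class and method name is hypothetical. The point is that the presenter holds the logic and talks to the view through an interface, so it can be unit-tested with a fake view and no Excel at all:

```python
class Model:
    """Data layer; in a real add-in this would run the SQL queries."""
    def get_products(self):
        return ["Bonds", "Equities", "FX"]


class Presenter:
    """Mediates between model and view; contains no UI code, so it is testable."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def load(self):
        # On form load, push the model's data into whatever view was injected
        self.view.show_products(self.model.get_products())


class FakeView:
    """Stand-in for the Excel UI view project, used in unit tests."""
    def __init__(self):
        self.products = []

    def show_products(self, products):
        self.products = products


view = FakeView()
Presenter(Model(), view).load()
print(view.products)
```

Swapping `FakeView` for a real Excel-backed view is the only change needed in production, which is exactly the separation-of-concerns benefit described above.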
BTW, I still use VBA for prototyping, and since I can simulate threading in VBA and often work in a class-based model, my designs usually transfer easily to .NET. And yes, sometimes VBA is just the better choice; in other cases, it is the choice for rapid development. In one solution, the front-end was VBA and class-based, but it extracted data to XML, then used SSIS to process and load the files into SQL Server, which in turn used stored procedures to turn the data into a cube. There was always the possibility that that VBA solution would be converted to C#, had the work needed to be better productionized.
A Journey — if You Dare — Into the Minds of Silicon Valley Programmers
My responses in a NY Times comment section for the book, Coders: The Making of a New Tribe and the Remaking of the World
by Clive Thompson
#1 - Link
Although I've been a software developer for 15 years, and for longer have alternated between project manager, team lead, and analyst roles, mostly in finance and now with a cancer center, I found it funny that you blame the people doing the coding for not seeing the harm it could cause. First, most scientific advancement has dark elements, and it is usually not the science but how it is used and sold by business people that is the problem. This leads to the second problem, in that it is not coding that is in itself problematic, but specifically how technology is harnessed to sell. It is normal and desirable to track users, to log actions, and to collect telemetry, so as to monitor systems, respond to errors, and develop new features, but that normal engineering practice has been used to surveil users for the purpose of selling. Blaming coders for this turn is like blaming them for the 90s internet bubble. As it is now, it is the rush for profits, not the technology, that is the problem. Many famous historical innovations were driven over the edge by corruption and wealth, not by the people involved in designing and building the systems, although we are so far along in commercialization that it is now part of many developers' roles to further the business model.
#2 - Link
@Ben - I completely agree. Most scientific advancement has dark elements, and it is usually not the science but how it is used and sold by business people that is the problem. This leads to the second problem, in that it is not coding that is in itself problematic, but specifically how technology is harnessed to sell.
As for its depictions of coders, I imagine many people not in tech think of coders as young 'bros' who make apps at cool companies, while in reality, the average coder is a 38-year-old married male with two children who makes intranet websites and desktop applications for mainstream businesses.
The art aspect is funny, as it is more about soft decision-making, where one has to weigh the value of one architecture over another, the future viability of a technology, the ability of coworkers to support and understand the work, and the aesthetics and usability of a website. All of these decisions can be made analytically or quantitatively, but more often than not, it is one's sense, based on reason and experience.
As for the bug, yes, a missing character might have been a problem when I was writing COBOL in 1982 (real story). Nowadays, the code checking built into IDEs flags such errors immediately, before compilation.
#3 - Link
@Eugene - 10x itself has gained mythical status, but mostly as a misunderstanding, then maybe usurped by an absurd culture of competition.
1 - “In one of their studies, Sackman, Erikson, and Grant were measuring performances of a group of experienced programmers. Within just this group, the ratios between best and worst performances averaged about 10:1 on productivity measurements and an amazing 5:1 on program speed and space measurements!” – The Mythical Man-Month
2 - The original study that found huge variations in individual programming productivity was conducted in the late 1960s by Sackman, Erikson, and Grant (1968). They studied professional programmers with an average of 7 years’ experience and found that the ratio of initial coding time between the best and worst programmers was about 20 to 1; the ratio of debugging times over 25 to 1; of program size 5 to 1; and of program execution speed about 10 to 1. They found no relationship between a programmer’s amount of experience and code quality or productivity.
#4 - Link
That is one aspect, but behind that is lots of reading, often more articles than books, covering design (UI/UX), patterns/architecture (GOF), algorithms, database design, best practices, management, social sciences, and operations (Deming to agile). All of this informs the decisions I make when building something for an employer. Granted, I am fairly bright, so would have acquired knowledge regardless, but to define coders as simply working with syntax is demeaning. Even people lacking a broader sense of the world can be deeply knowledgeable in their respective domains.
Plugin development for Excel
This is a response to an old question on Stack Overflow
As mentioned by others, there are three basic Office technologies other than XLLs: VSTO, VBA, and the Office JS API.
My personal experience, having worked with all three, is that VSTO is the most powerful, in either VB.NET or C#, as they are essentially the same language. The future road map for the two languages shows divergence, in that C# will receive higher-end features while VB.NET will be positioned as the easy-to-use one, but at this point, there is little difference between the two for programming Excel. VSTO provides built-in processes for versioning, release, automatic updates, and rollback, and is capable of anything within the .NET library.
VBA is the original programming language for Office, and most samples are based on it. You can create fairly complex ribbons and context menus with it, but it is incapable of async/threaded operations and is lightweight for service-related work. That said, if you don't need such operations, VBA can work, but you should have some plan for versioning and code management, which is not natively part of the VBA sphere and will be entirely managed by you.
The Office JS API is like programming web pages within Office, using JS for operations - newer samples leverage React and Angular - with HTML and CSS for the panels. My recent experience converting Outlook VSTO add-ins was frustrating, as many easy-to-implement features in VSTO/VBA are not available, or are more complicated, in JS. The interface, though, is quite nice, much better looking than the typical WinForm, and it can work in web-based Office clients, unlike VSTO.
The XLL link you provided is for a wrapper around C++. This is likely more complicated than any of the other types, and although there is power in going lower, you would need to have the experience and skills to make it worthwhile.
Desktop: All (VSTO, VBA, JS)
Web/Mobile: JS only
Easy Upgrade and Code Management: VSTO, JS
UI: JS is better looking, VSTO/VBA is WinForm looking
Skills: HTML/CSS/JS (Web) vs VBA/WinForms vs VSTO/WinForms (C#/VB.NET)
Examples: VBA, VSTO, JS, in decreasing order
4 Tips for Surviving a Toxic Workplace Culture
In retrospect, when I've decided to leave an organization, I haven't documented everything, but I can see from others how that might matter.
For myself, I've focused on the following, very much the same as the article:
Refining and developing skills
Being one's best, aiming to shine while searching for an exit
Connecting with everyone with whom you have a positive relationship
Stuck and Stressed: The Health Costs of Traffic
I responded to an article in the NY Times, Stuck and Stressed: The Health Costs of Traffic
, several times, and the following is relevant to technology:
The ability to work remotely can be a big enhancement, particularly for those with long commutes. It's not for everyone, but it can greatly enhance productivity and a feeling of well-being. I recently reposted an article touting how tech employees are both more productive and more satisfied when they have the remote work option. Granted, this is not always a choice for roles that require face time and presence, but for many - currently, I am spending 60% of my time coding - it could reduce personal transportation and clothing costs, and for companies it could reduce turnover and site costs - people can share desk space - as well as make some more productive and happier.
Granted, it is not for everyone. A study looking at personality traits conducive to remote work found that autonomy and stability were associated with the least stress when working from home. I'm sure there are other views, differences dependent on the level of extroversion, tech-savviness, and independence, etc., as well as aspects of the physical environment, but regardless, it would certainly go a long way toward reducing the problems of commuting.
Organize your life with Personal Kanban
Responding to an article in TreeHugger
If one manages software processes, the same inclination carries over into one's own life; one can't help but think of applying Kanban to one's personal tasks. That said, efficiency is not always a virtue. I've read that people are happier multi-tasking, so striving for efficiency is not necessarily pleasurable.
My personal code and sites are managed in VSTS, while my work environments have used a variety of systems, most recently Jira. Although I have explored a variety of Kanban systems to manage my life, I eventually settled on Trello, owing to its cost and ease of use. That said, I more often manage tasks as simple lists, either in Wunderlist, Instapaper, email - I find my inbox a very effective way of tracking diverse items - or OneNote. As is always the case with lists, they can become unwieldy and outdated, so they require periodic review, deletion, and reorganization.
James Igoe's Reviews > Godel's Proof
by Ernest Nagel
My rating: 4 of 5 stars
Interesting and not terribly written but in a few areas highly repetitive. My background is not deeply mathematical, so maybe I was missing a subtlety here and there, but it seemed to be stating the same meanings over and over, although the bulk of the book was engaging and thought-provoking for someone 'mathy' like myself...
View all my reviews
James Igoe's Reviews > Thinking Architecturally
by Nathaniel Schutta
My rating: 4 of 5 stars
An overview of architectural decisions, the politics and persuasion involved, and the needs to balance competing measures and attributes. A fairly easy read, but full of great suggestions, and, for many, reminders of how to handle being a senior developer or architect.
View all my reviews
Complexity: A Guided Tour
Complexity: A Guided Tour
My rating: 4 of 5 stars
I enjoy reading in systems and complexity, and this was a nice addition to my shelf, with a slightly different take than other books. I found a few areas in the first half a bit tedious, overly long, repetitive, and not illuminating, but generally, it's a great overview of seminal work and very thought-provoking. The first half overlaps but nicely differs from other books I've read, covering things like chaos and information processing, and the latter half of the book I found more engaging, focused on models, computation, network science, and scaling. As mentioned, although I found the first half a bit of a slog at times, the second half was very engaging.
View all my reviews
Review - TFS/VSTS - Great Product, Ideal for Small Development Shops
This is a repost of a short review I provided to G2 regarding TFS
What do you like best?
If you use Visual Studio for development, TFS, or its online equivalent VSTS, you can have a fairly seamless end-to-end integration. Out of the box, it provides code management, testing, work hierarchy in agile formats, automated build, and deployment.
What do you dislike?
Branching and merging can be a bit painful, in that it needs to be planned, and is not natively part of the process. Code review also needs to be planned and only recently has it become part of the process.
Recommendations to others considering the product
My only concern regarding TFS and VSTS is that Microsoft itself recommends using Git.
What business problems are you solving with the product? What benefits have you realized?
In my current role, I've joined a shop that has application development as secondary to their role of desktop OS and app deployment/maintenance, so their code management practices are minimal. I am working towards getting all of their code into TFS, converting much of it to newer technologies, and using TFS to automate the process of build and deployment, although the near-term target is continuous integration.
Patents & Innovation
Over dinner, a friend mentioned that she thought a particular country produced the most patents, and although I remember reading the same claim about 10 years ago - I believe it was in the NY Times - it is no longer true, if it ever was. Looking at patents per capita, I found a variety of articles based on quality sources, and although the country does not rank in the top 10, it does rank well in Bloomberg's Innovation Index.
The latter is not solely based on patent numbers since one needs to consider other measures of innovation. Bloomberg's scoring includes indicators such as R&D spending, manufacturing, the number of high-tech companies, secondary education attainment, and the number of research personnel.
On a separate note, countries with large engineering and semiconductor industries and those that score well in international comparisons on science and math will dominate patents and innovation, as well as those countries with freer cultures, although this is synergistic, in that both the industries and social capital measures feed each other.
Some of my own informal research into Hofstede's cultural dimensions and patent production found that the two (2) dimensions with the highest correlations and P-values under .01 were Uncertainty Avoidance and Individuality. Essentially, cultures that tolerate ambiguity and are the least rule-based, along with having high individuality, produce a larger number of patents.
Because of the high-tech industries they support, their high levels of education, and their generally free cultures, the Scandinavian countries perform well. It is similarly so for South Korea and Japan: although they generally do not have what we would think of as free cultures, being much more rigid and rule-based, they do have very high levels of technical education and industries that rely on those skills.
The Low Probability of Hiring Software Engineers
Hiring is a fairly complicated process, yet, although somewhat obvious, it is easily described by a simple probability equation. So, excluding the likelihood of getting past the recruiter:
P(hire) = P(phone screen) * P(sample project) * P(2 interview teams) * P(accepting)
Even including some kind of Bayesian inference, increasing the odds of passing subsequent steps, or tilting candidate characteristics, the probability of a hire remains fairly low, with an increased likelihood of rejecting a good candidate, a false negative. Still, one can understand the aversion to a false positive, as it can be very expensive.
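To make the arithmetic concrete, here is a small sketch of the funnel; the per-stage pass rates are invented for illustration, not taken from the source article:

```python
# Assumed pass rates for each stage of the funnel (invented for illustration)
stage_probabilities = {
    "phone screen": 0.5,
    "sample project": 0.6,
    "interview team 1": 0.5,
    "interview team 2": 0.5,
    "candidate accepts": 0.8,
}

# Independent stages multiply, so P(hire) is the product of the pass rates
p_hire = 1.0
for stage, p in stage_probabilities.items():
    p_hire *= p

print(f"P(hire) = {p_hire:.3f}")  # 0.5 * 0.6 * 0.5 * 0.5 * 0.8 = 0.060
```

Even with middling-to-good odds at every single step, the overall probability lands at 6%, which is why so many plausible candidates are rejected along the way.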
Source: Bayesian Inference for Hiring Engineers
AI in Software Development
Even before AI, I would have thought that work being done now would be automated, and of course, AI will replace some work - since developers automate tasks themselves, using rules, patterns, and processes - but the idea is always to stay ahead of the 'crushing wave' of new tech, often automating oneself out of a job, thereby keeping your job...
BTW, this mentions interesting tools leveraging AI to help coders, rather than simply replacing them, since the latter is not currently a realistic scenario.
Source: Will A.I. Take Over Your Programming Job?
How do you deal with making sure your use of new technology is correct and free from code-smells?
Responding to How do you deal with making sure your use of new technology is correct and free from code-smells, security issues, etc.?
Issues can be dealt with in several ways.
Understanding what makes high-quality, maintainable code would be first, so knowledge of best practices regarding OOP, SOLID, design patterns, API design, etc. is important. Depending on what you mean by security, best practices in those regarding transfer protocols, coding styles, validation, storage, etc. are equally something one can learn.
Planning your work is useful, as a well thought out design is easier to implement, or at least will avoid future problems, than when you are just 'winging it'. Diagramming and project plans can be useful at this stage. Self-management is part of this, so using boards and epic/stories/tasks to track work is important, and there are free tools like Visual Studio Team Services (VSTS) or Trello to help.
Requirements gathering will matter, so documentation and communication with users and/or clients will make a huge difference. Usability also matters, so understanding how to build code for others, whether it is a UI or an API, will be important, to keep your clients happy and to avoid rework. With a UI, mockups can be useful, so using Balsamiq or Visio to put together the basics can be a starting point for discussion with users.
It would depend on your code stack. I work primarily in the Microsoft stack, so there are maintenance tools built into Visual Studio (VS) to check for code quality/maintainability and for code clones. Purchasing licenses for products like ReSharper can help. As part of the automated build process, VSTS has components for testing, code quality (ReSharper), and build quality, executed on check-in.
Independent of the stack, using TDD or unit tests is important, besides saving you time and effort. As an independent, it's tough to work in pairs, but code review can be useful, so enlisting someone to review your work can help.
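As a tiny illustration of the kind of unit test meant here, a sketch using Python's standard unittest module; the function under test and its values are hypothetical, and the tests run with `python -m unittest`:

```python
import unittest


def compound_returns(a, b):
    """Toy function under test: compound two period returns."""
    return (1 + a) * (1 + b) - 1


class CompoundReturnsTests(unittest.TestCase):
    """Each test pins down one behavior, so regressions surface immediately."""

    def test_compounding(self):
        # 10% then 10% compounds to 21%, not 20%
        self.assertAlmostEqual(compound_returns(0.10, 0.10), 0.21)

    def test_zero_returns(self):
        self.assertEqual(compound_returns(0.0, 0.0), 0.0)
```

Even a handful of tests like these documents intent and catches breakage on refactoring, which is where the time savings mentioned above come from.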
Do Algorithms Make You a Better Developer?
Responding to a question on HashNode, Developers who practise algorithms are better at software development than people who just do development. Is it true?
, I wrote the following:
My feeling is that algorithms help make one a better programmer, but that is likely true of many coding concepts. I did not have algorithms as an undergraduate, so my knowledge was acquired through reading and practice, but after reading and applying Algorithms in a Nutshell, I felt the quality of my work improved. That said, my development work improved more after understanding Design Patterns, or after consuming books on database design.
Since many types of knowledge improve developing and architecting abilities, one has to consider how each helps and to what degree. Algorithms are coding-in-the-small, often narrowly focused solutions, but ones which can have a great impact at scale. For many applications, a focus on algorithms would be overkill, as data sets and requirements do not require it. In this context, any middling programmer can optimize a basic loop for performance. Proper database design, either relational or OLAP/OLTP, will make your applications better, from both maintenance and performance perspectives. Object-oriented programming makes some types of designs better, those that add objects, while learning correct functional programming helps in contexts where you are increasing functions on a limited number of objects. Learning enterprise architecture helps in the design of large-scale operations.
One could equally argue that learning and practicing self-management, communication skills, and code management all make for better programmers, and they do. Ultimately, learning makes one a better developer.
My Self-Development in 2017
My corporate annual review period recently passed, and I was reminded of all the skills developed and completed tasks over the past year, both in and out of work. Sincerely, remembering what I've done over the past year makes me feel good, and really reminds me of how much I enjoy learning.
Although largely focused on reading to learn, I do partake of various streaming video resources via Pluralsight. The courses I've completed this past year:
Multiple courses on management and leadership
Quantitative & AI-related courses, accompanied by work in R, Python, or VBA
Work and Periphery
To avoid repeating myself, here are My Most Popular Posts of 2017
Review: Complex Adaptive Systems: An Introduction to Computational Models of Social Life
Complex Adaptive Systems: An Introduction to Computational Models of Social Life
John H. Miller
My rating: 4 of 5 stars
A thought-provoking introductory exploration of modeling social systems, covering ideas for rule-based agents within a variety of rule-based systems, moving on to evolutionary-like automata and the organization of agents to solve problems. Underlying some of the ideas, one could see references to deeper concepts, e.g., nonlinearity, attractors, emergence, and complexity, none of which were explained explicitly. At times, I did find the writing tedious, as some ideas were too obvious to spend time detailing, but overall, a well-written, easy-to-digest text.
View all my reviews
My Most Popular Posts of 2017
Although I've made many posts on my Data Analytics Workouts site in the past year or two, some generated more interest than others - nothing here was virally popular - so I've written a post listing the most popular ones. Here is a link to the post.
Anyone that knows me knows that I read a great deal, and one of the topics I focus on is management and leadership. It has meant attending B-school, reading books on management, as well as reading numerous articles and studies - I definitely prefer to base my ideas on statistical proof - so I think I have a good sense of what research says excellent management and leadership mean. After reading a blog post that resonated with me, but that I thought overly specific, I decided to abstract the article's rules into something generic, add some needed items, then convert those items into practice.
- Making sure one's team has adequate tools, resources, contacts, and training
- Being a leader, and in that providing vision, expectations, goals, and standards, as well communicating that clearly
- In one's self, exemplifying excellence, being a role model, maintaining a positive image, having personality and charm, while earning respect
- In one's team, having excellence, cohesion, friendship, and camaraderie
- Developing one's people, having a concern for their welfare, providing praise and encouragement, and listening
- For the business, service, strategic goal-setting, clear communication, protecting the team, improving efficiency, managing requirements and resources
The only issue is that this list is a bit of a kitchen-sink laundry list, including everything without concern for appropriateness. When I look through my history, very few managers have been what I saw as truly excellent. Some items are also not specifically a manager's duty but are provided by the organization, such as training.
How to Tell If You're a Great Manager
- Do I know what is expected of me at work?
- Do I have the materials and equipment I need to do my work right?
- At work, do I have the opportunity to do what I do best every day?
- In the last seven days, have I received recognition or praise for doing good work?
- Does my supervisor, or someone at work, seem to care about me as a person?
- Is there someone at work who encourages my development?
- At work, do my opinions seem to count?
- Does the mission/purpose of my company make me feel my job is important?
- Are my co-workers committed to doing high-quality work?
- Do I have a best friend at work?
- In the last six months, has someone at work talked to me about my progress?
- This last year, have I had opportunities at work to learn and grow?
Software engineers will be obsolete by 2060
In response to an article on Medium, Software engineers will be obsolete by 2060
, I responded with the following
There is an interesting article in The Economist titled Automation Angst, http://www.economist.com/node/21661017, that looks at several publications on the historical effects of automation, and although there is always a fear of being replaced, ultimately more jobs are created than destroyed. Software engineers disappear? So what! There will be other jobs, with different titles, and in the interim, the more people use tech, the more need there will be for software engineers.
Because of this, a person asked for my opinion on maintaining their career as a .NET developer, to which I responded:
Although I am a .NET developer as well, I focus on expanding my project management and leadership skills, as well as developing skills in AI/ML. Rather than bore with all the details of my background, here is what I think:
Somewhat more generally:
- You should develop your skills in AI/ML, if only by familiarizing yourself with TensorFlow and CNTK. Even if you are not the expert, you should understand it.
- Develop your leadership and management skills, if only to become a better developer, even if you don’t aspire to a higher rank, since having those skills will make one a more hireable developer.
- Although the landscape might change, it is unlikely that you could not find a job with .NET skills in 5 to 6 years, but the most important thing is to keep in touch with the changes, developing as needed.
- Consider what would put you out of a job, say, automation that builds the things you already build, and do that. Stay ahead of the wave that is going to crush you by becoming part of the wave itself.
- Create an online presence via blogs, codeshares, NuGet and GitHub repo’s, contributions to other projects, and career sites. Make recruiters come to you.
Using Visual Studio Team Services for Personal Development
The complete post is published on my Data Analytics blog as Using Visual Studio Team Services for Personal Development
, the following just a quick introduction.
Microsoft provides free access to its online Visual Studio Team Services (VSTS)
, and for some time I've been using the service. I've wanted to restructure my code hierarchy, and recent changes in my work environment - automated build and deployment using Octopus - nudged me to finally take on the task, so in the past few weeks I've:
- Restructured my Code library into one big project with sub-projects for Development, Websites, and Work
- Developed my Work hierarchy of Epics, Features, Stories and Tasks, along with queries and sprint boards
- Automated all of my builds via check-in, adding extensions to evaluate code and build quality
- Developed a dashboard to oversee the status of work
Pinterest as a Publication Channel for Data Analytics
More as an experiment than an attempt at sharing code and ideas, I created a Pinterest board
devoted to my personal data analytics work, done with Python, R, or F#, as well as reviews of books, and was quite surprised with the result.
The graphics could do with optimization, but otherwise...
Deep Learning and Toolkits
As part of reading Fundamentals of Deep Learning: Designing Next-Generation Machine Intelligence Algorithms by Nikhil Buduma
, I planned to work through the code examples with my own data, and for the library, the book recommends TensorFlow
, which brings up competing alternatives, a primary one being Microsoft Cognitive Toolkit
. Over the next few weeks, I will start exploring both in Python, as well as publishing some of the related work.
A minor note, the darksigma/Fundamentals-of-Deep-Learning-Book: Code companion to the O'Reilly "Fundamentals of Deep Learning" book
is available on GitHub.
Principal Component Analysis (PCA) on Stock Returns in R
Principal Component Analysis is a statistical process that distills the variation in a set of correlated measurements into a smaller set of uncorrelated vectors with greater ability to predict outcomes, through a process of scaling, covariance estimation, and eigendecomposition.
MS Azure Notebook
The work for this is done in the following notebook, Principal Component Analysis (PCA) on Stock Returns in R
, with detailed code, output, and charts. An outline of the notebook contents is below.
Overview of Demonstration
- Supporting Material
- Load Data: Format Data & Sort
- Prep Data: Create Returns
- Eigen Decomposition and Scree Plot
- Create Principal Components
- FVX using PCA versus Logistic Regression
- Alternative Libraries: Psych for the Social Sciences
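The eigendecomposition step above is concrete enough to sketch without the notebook. Below is a minimal pure-Python illustration of the two-asset case, where the symmetric 2x2 covariance matrix has closed-form eigenvalues; the data and names are invented, not taken from the notebook:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    """Sample covariance of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

def pca_2d(xs, ys):
    """Eigendecompose the 2x2 covariance matrix [[a, b], [b, c]]."""
    a, c = cov(xs, xs), cov(ys, ys)
    b = cov(xs, ys)
    # Eigenvalues of a symmetric 2x2 matrix, via the quadratic formula
    mid = (a + c) / 2
    half_span = math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    l1, l2 = mid + half_span, mid - half_span  # l1 >= l2
    # Eigenvector for l1: solve (A - l1*I)v = 0, giving v = (b, l1 - a)
    v = (b, l1 - a) if abs(b) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(*v)
    return (l1, l2), (v[0] / norm, v[1] / norm)

# Two correlated toy "return" series
xs = [0.01, -0.02, 0.03, 0.00, -0.01]
ys = [0.012, -0.018, 0.027, 0.001, -0.012]
(l1, l2), pc1 = pca_2d(xs, ys)
```

The larger eigenvalue's share of the total, l1 / (l1 + l2), is what a scree plot charts, and projecting the centered returns onto pc1 yields the first principal component series.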
Exercises in Programming Style by Cristina Videira Lopes
Exercises in Programming Style
Cristina Videira Lopes
5 of 5 stars
An easily consumed, enjoyable read, and an excellent review of the history of programming style, from the older days of constrained memory and monolithic styles, through pipelining and object-oriented variants, to more recent patterns like model-view-controller (MVC), MapReduce, and representational state transfer (REST). Along the way, each variant is described, along with its constraints, its history, and its context in systems design.
View all my reviews
Data Mining for Fund Raisers
This is a repost of a Goodreads review I did a little over 4.5 years ago, for a book I read twelve (12) years ago, which seemed relevant, as the industry seems to be picking up a data-driven focus. Plus, the world is now being transformed by advances in machine learning, particularly deep learning, and the large data sets and complexity of donor actions should greatly benefit from analysis.
Data Mining for Fund Raisers: How to Use Simple Statistics to Find the Gold in Your Donor Database Even If You Hate Statistics: A Starter Guide
by Peter B. Wylie
My rating: 4 of 5 stars
My spouse, at times a development researcher of high-net worth individuals, was given this book because she was the 'numbers' person in the office. Since my undergraduate was focused on lab-design, including analysis of results using statistics, I was intrigued and decided to read it. Considering my background, I found some of the material obvious, while other aspects were good refreshers on thinking in terms of statistics.
Below is the synopsis I wrote at the time:
Purpose of Book
- To provide a general outline of a statistically-oriented method to improve funding activities by mining your current donor database
- To provide general techniques for analyzing data, as well as provide cautions against bad techniques
How the Process Can Improve Endowment Activities
- Allows the organization to more accurately target quality prospects, either to increase participation rates, or to find major givers more inclined to donate
- Allows the organization to reduce costs, or more effectively use limited resources, i.e., phone smaller sets of people, limit the size of mailings, while increasing donations
Outline of Method (Non-Technical)
- Export sample of donor database
- Split sample into smaller components
- Find relationships between donor features and giving
- Select the significant variables
- Develop scoring system
- Validate findings
- Test finding on limited appeals and compare results
- Assumes the donor data is extractable and randomized
- Requires export from donor database, or access via SQL
- Assumes additional software for statistics (DataDesk, SAS, SPSS)
- Requires IT staff, analytical staff, donor contacts, and management to coordinate efforts
- Requires IT and analytical staff have adequate skills to implement
- Judges variables of data by both its intrinsic value and based upon its inclusion in database
View all my reviews
Tips for Staying Employed as an Older Developer
A response to an article Tips for Staying Employed as an Older Developer
A bit about myself: I am older and working as a developer, team lead, and project manager, writing here to add to the options for staying relevant, and ways to let the world know about it.
- GitHub - I have several libraries in C# and F#, so that others can directly use and evaluate my code
- NuGet - Packaged versions of code shared on GitHub
- Blogs - lately I have been learning data languages, and have several blogs, some focused on design patterns and algorithms, as well as one focused on data analysis using R, Python, and F#
- Websites - I have several sites, along with blogs, all accessible from a primary site, James Igoe. This site has links to other sites and blogs, one of which is an older site where I share code as downloads - this site predates GitHub - in my core languages, VBA, C# and SQL, tools for doing programming interviews, as well as cheat sheets.
- Reposts of career and tech-related articles on LinkedIn, GooglePlus (communities), Twitter, Facebook (page)
- Training - Yes, like others I am always learning, but I also share the material I work through and my opinion about it, meaning writing book reviews and sharing my opinion on courses from Pluralsight.
Value-at-Risk (VaR) Calculator Class in Python
As part of my self-development, I wanted to rework a script, scripts typically being one-offs, into a reusable component, although there are existing packages for VaR. As such, this is currently a work in progress. This code is a Python-based class for VaR calculations
, and for those unfamiliar with VaR, it is an acronym for value at risk, the worst-case loss in a period for a particular probability. It is a reworking of prior work with scripted VaR calculations
, implementing various high-level good practices, e.g., hiding/encapsulation, do-not-repeat-yourself (DRY), dependency injection, etc.
- Requires data frame of stock returns, factor returns, and stock weights
- Calculate and return a single VaR number for different variance types
- Calculate and return an array of VaR values by confidence level
- Calculate and plot an array of VaR values by confidence level
Still to do:
Note: Data to validate this class is available from my Google Drive Public folder
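As a rough sketch of the shape such a class can take - the names and internals below are mine, not the published code - here is a historical-simulation variant that encapsulates the sorted returns and exposes VaR at one or many confidence levels:

```python
class HistoricalVaR:
    """Sketch of a reusable VaR component: inject returns once, query many times."""

    def __init__(self, returns):
        # Encapsulation: callers never touch the sorted series directly,
        # and sorting happens exactly once (DRY)
        self._sorted = sorted(returns)

    def var(self, confidence=0.95):
        """Worst-case loss not exceeded with the given probability."""
        # Index of the (1 - confidence) empirical quantile
        idx = int((1.0 - confidence) * len(self._sorted))
        return -self._sorted[idx]

    def var_curve(self, levels=(0.90, 0.95, 0.99)):
        """Array of (confidence, VaR) pairs by confidence level."""
        return [(lvl, self.var(lvl)) for lvl in levels]

# Toy daily returns standing in for the injected stock-return data
returns = [0.01, -0.03, 0.002, -0.015, 0.004, 0.02, -0.007, 0.011, -0.022, 0.005]
model = HistoricalVaR(returns)
curve = model.var_curve()
```

The real class also takes factor returns and stock weights and supports several variance types; this sketch only shows the injection-plus-query structure.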
Calculating Value at Risk (VaR) with Python or R
The following modules linked below are based on a Pluralsight course, Understanding and Applying Financial Risk Modeling Techniques
, and while the code itself is nearly verbatim, this is mostly for my own development, working through the peculiarities of Value at Risk (VaR) in both R and Python, and adding commentary as needed.
The general outline of this process is as follows:
Load and clean Data
Calculate historical variance
Calculate systemic, idiosyncratic, and total variance
Develop a range of stress variants, e.g. scenario-based possibilities
Calculate VaR as the worst case loss in a period for a particular probability
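Under a normality assumption, the final step collapses to a quantile of the fitted return distribution. A standard-library sketch with invented returns (the course code itself uses R and Python finance packages):

```python
from statistics import NormalDist, mean, stdev

# Toy daily returns standing in for the loaded and cleaned data
returns = [0.012, -0.025, 0.004, -0.011, 0.008, 0.015, -0.006, 0.002]

# Historical mean and volatility
mu, sigma = mean(returns), stdev(returns)
confidence = 0.95

# VaR: the loss at the (1 - confidence) quantile of the fitted normal
var_95 = -NormalDist(mu, sigma).inv_cdf(1 - confidence)

# A crude stress variant: scale volatility up and recompute
var_95_stressed = -NormalDist(mu, 2 * sigma).inv_cdf(1 - confidence)
```

Splitting sigma into systemic and idiosyncratic parts, as the course does, changes how the variance is built, not this final quantile step.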
Review: The Systems View of Life: A Unifying Vision
My rating: 5 of 5 stars
An excellent, incredibly insightful and informative book, somewhat marred by the tedium experienced in the authors' rehashing the ideas of organizations working for change. For most of this book, the writers masterfully tie together concepts in systems, mathematics, consciousness, the environment, society and biology, and for that, it is a brilliant read.
The Systems View of Life: A Unifying Vision
by Fritjof Capra
View all my reviews
Review: Make Your Own Neural Network
As part of understanding neural networks, I read Make Your Own Neural Network by Tariq Rashid. A review is below:
Make Your Own Neural Network
My rating: 4 of 5 stars
The book itself can be painful to work through, as it is written for a novice, not just in algorithms and data analysis, but also in programming. For the neural network aspect, it jumped between overly simplistic and complicated, while providing neither in enough detail. That said, by the end I found it a worthwhile dive into neural networks, since once it got to the programming structure, it all made sense, but only because I stuck with it.
View all my reviews
F# is Part of Microsoft's Data Science Workloads
I have not worked in F# for over two (2) years, but am enthused that Microsoft has added it to its languages for Data Science Workloads, along with R and Python. To that end, I hope to repost some of my existing F# code, as well as explore Data Science Workloads utilizing all three languages. Prior work in F# is available from learning F#
, and some solutions will be republished on this site.
Data Science Workloads
Published Work in F#
I explored F# some time ago, intrigued by the idiom of functional languages and the strengths of F#. Additionally, some solutions can be improved upon by using .NET components to speed the process, e.g., parallel processing, and the language itself is not simply functional, but can also be used for object-oriented and procedural development.
- Functions (non-state, no side effects)
- Tail recursion
- Non-mutable variables
- Lambda notation
- Pattern matching
- Sequences, arrays, lists, and tuples
- Array slicing
Pluralsight Courses - Opinion
My list is kind of paltry, but I’ve sat through other courses, or started many and decided against finishing them. The best courses I’ve finished have been along the lines of project management:
I’ve also sat through this, which is useful, although very rudimentary:
I do my own reading for data science, and have my own side projects, but I’ve also taken some data science courses via Pluralsight. The beginner demos are done well, although less informative than the intermediate ones, which are ultimately more useful. For the latter, I typically do simultaneous coding on my own data sets, which helps me learn the material.
Geert Hofstede | Defined Corporate Culture
I've been interested in Hofstede's work since B-school, back in the early 2000s, and at one time considered publishing using his country characteristics as predictors for economic and social welfare outcomes. Nowadays, I use the results of his analyses frequently in small R programming demonstrations.
He's an interesting researcher, who's done important work, as The Economist article describes
The man who put corporate culture on the map—almost literally—Geert Hofstede (born 1928) defined culture along five different dimensions. Each of these he measured for a large number of countries, and then made cross-country comparisons. In the age of globalisation, these have been used extensively by managers trying to understand the differences between workforces in different environments.
The Economist article
gives a fuller picture of Geert Hofstede
, and anyone interested in reading one of his works might enjoy Cultures and Organizations: Software of the Mind, Third Edition
. An interesting dive into Geert's research and its implication, with a fairly high reader score.
As for sampling of analyses I've posted using Hofstede's cultural dimensions:
Neural Network Series in R
While developing these demonstrations in logistic regression and neural networks, I used and discovered some interesting methods and techniques:
A few useful commands and packages...:
- update.packages() for updating installed packages in one easy action
- as.formula() for creating a formula that I can reuse and update in one action across all my code sections
- View() for looking at data frames
- fourfoldplot() for plotting confusion matrices
- neuralnet for developing neural networks
- caret, used with nnet, to create predictive model
- plotnet() in NeuralNetTools, for creating attractive neural network models
Resources that I used or that I would like to explore...
- MS Azure Notebooks, for working online with Python, R, and F#, all part of MS's data workflows
- Efficient R Programming, that seems to have many good tips on working with R
- Data Mining Algorithms in SSAS, Excel, and R, showing various algorithms in each technology
- R Documentation, a high quality, useable resource
To explore this series...
Microsoft Azure Notebooks - Live code - F#, R, and Python
I was exploring Jupyter notebooks
, which combine live code, markdown, and data, through Microsoft's implementation, known as MS Azure Notebooks
, putting together a small library of R and F# notebooks
Microsoft's FAQ describes the service as:
...a multi-lingual REPL on steroids. This is a free service that provides Jupyter notebooks along with supporting packages for R, Python and F# as a service. This means you can just login and get going since no installation/setup is necessary. Typical usage includes schools/instruction, giving webinars, learning languages, sharing ideas, etc.
Feel free to clone and comment...
Review - Efficient R Programming: A Practical Guide to Smarter Programming
Efficient R Programming: A Practical Guide to Smarter Programming
5 of 5 stars
Simply a great book, chock full of tips and techniques for improving one's work with R.
View all my reviews
Data Analytics Workouts: Support Vector Machines on Big Five Traits and Politics
Working through a Pluralsight tutorial on Support Vector Machines, I adapted it to my usual data sets, Hofstede's Cultural Dimensions and Big Five state-level personality traits:
Support Vector Machines on Big Five Traits and Politics
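The tutorial's code is not reproduced here, but the core idea of a linear SVM - minimize hinge loss plus an L2 penalty, which maximizes the margin - fits in a short sub-gradient-descent sketch; the toy data and parameters are mine:

```python
def train_linear_svm(points, labels, lam=0.01, epochs=200, lr=0.1):
    """Sub-gradient descent on hinge loss + L2 penalty (primal linear SVM)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):  # labels are +1 / -1
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            if margin < 1:  # inside the margin: hinge sub-gradient step
                w[0] += lr * (y * x1 - lam * w[0])
                w[1] += lr * (y * x2 - lam * w[1])
                b += lr * y
            else:           # correctly classified with room: only shrink w
                w[0] -= lr * lam * w[0]
                w[1] -= lr * lam * w[1]
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Linearly separable toy data: two "trait" dimensions, two "party" labels
points = [(1, 1), (2, 1), (1, 2), (5, 5), (6, 5), (5, 6)]
labels = [-1, -1, -1, 1, 1, 1]
w, b = train_linear_svm(points, labels)
```

A real analysis would use a tuned library implementation (e.g., e1071 in R or scikit-learn in Python); this is only the underlying optimization, written out.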
Data Analytics Workouts: Charting Correlation Matrices in R
I noticed this very simple, very powerful article by James Marquez, Seven Easy Graphs to Visualize Correlation Matrices in R
, in the Google+ community, R Programming for Data Analysis
, so I thought to give it a try, since I started some of my current analyses a decade ago by generating correlation matrices in Excel, which I've sometimes redone and improved in R.
Charting Correlation Matrices in R
Data Analytics Workouts: Data Mining Explorations
Another Pluralsight training course, Data Mining Algorithms in SSAS, Excel, and R
, had sections on Naive Bayes and on decision trees. I worked through the examples, but varied them using my own data and code, linked below.
Naive Bayes on Political Outcome Based on State-level Big Five Assessment
Decision Trees (party) on Political Outcome Based on State-level Big Five Assessment
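For reference, Gaussian Naive Bayes is small enough to write out directly: fit a per-class mean and standard deviation for each feature, then pick the class with the highest log prior plus summed log likelihoods. The trait values and labels below are invented, not the state-level data:

```python
import math
from collections import defaultdict
from statistics import mean, stdev

def fit_gaussian_nb(rows, labels):
    """Per-class, per-feature (mean, stdev) plus class priors."""
    by_class = defaultdict(list)
    for row, y in zip(rows, labels):
        by_class[y].append(row)
    model = {}
    for y, group in by_class.items():
        cols = list(zip(*group))  # transpose rows into feature columns
        stats = [(mean(c), stdev(c)) for c in cols]
        model[y] = (len(group) / len(rows), stats)
    return model

def predict_nb(model, row):
    """Class with the highest log prior + sum of log Gaussian likelihoods."""
    def log_post(y):
        prior, stats = model[y]
        lp = math.log(prior)
        for x, (m, s) in zip(row, stats):
            lp += -math.log(s * math.sqrt(2 * math.pi)) - (x - m) ** 2 / (2 * s ** 2)
        return lp
    return max(model, key=log_post)

# Two made-up traits (e.g., openness, conscientiousness) per "state"
rows = [(55, 40), (58, 38), (60, 42), (40, 60), (42, 58), (38, 62)]
labels = ["blue", "blue", "blue", "red", "red", "red"]
model = fit_gaussian_nb(rows, labels)
```

The "naive" part is the independence assumption: each feature contributes its likelihood separately, which is what keeps the model this compact.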
Data Analytics Workouts: Predicting Google's Stock Price
Running through a Pluralsight training course, Understanding and Applying Logistic Regression
, an example was code predicting Google's stock price from the SPDR ETF's price changes, although tweaked to my own style...
Logistic Regression on Stock Data using Google and SPY (SPDR S&P 500)
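The mechanics under the hood - a sigmoid over a linear score, fitted by gradient descent on the log-loss - in a standard-library sketch; the SPY returns and outcomes below are made up, not the course data:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=2000):
    """Gradient descent on the log-loss for a single predictor."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):          # y is 1 if the target stock rose
            err = sigmoid(w * x + b) - y  # prediction error
            gw += err * x
            gb += err
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Made-up SPY daily returns (%) and whether the target stock rose that day
spy = [0.8, 1.2, -0.9, 0.3, -1.5, 0.6, -0.4, 1.0]
up  = [1,   1,   0,    1,   0,    1,   0,    1]
w, b = fit_logistic(spy, up)
prob_up = sigmoid(w * 1.0 + b)  # P(stock up | SPY +1.0%)
```

In R this is `glm(up ~ spy, family = binomial)`; the sketch just makes the fitting loop visible.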
Review: Text Analysis with R for Students of Literature (Quantitative Methods in the Humanities and Social Sciences)
Text Analysis with R for Students of Literature
by Matthew L. Jockers
My rating: 5 of 5 stars
Engaging writing, with code samples and practices. As for programming, I thought the code quality was somewhat low or sloppy, but Jockers is not a software developer by trade. While reading, I did have a few ideas to solve some text-matching issues across systems, and generally, I found the lack of discipline in the author's approach conducive to flexible thinking about using techniques with R.
View all my reviews
Cultures and Organizations: Software of the Mind, Third Edition
Although this is not a technical book, it is highly relevant to management, project and otherwise, particularly if you work in technology. Our teams are often composed of internationally originating, and internationally located members, and understanding how such difference might affect team dynamics is essential.
Cultures and Organizations: Software of the Mind, Third Edition
4 of 5 stars
A detailed and fascinating review of Hofstede's dimensions, by the researcher himself, showing broad high-level insights into history and culture, although a bit tedious, as it often describes in detail relationships many of us implicitly understand.
View all my reviews
Data Analytics Workouts: Recent Post using R
I have dabbled with statistical analysis since college, eventually wanting to publish academically while I was in B-school, and recent growth in big data and analytics motivated me to update some of my prior analyses with new technologies. This is not a prior analysis, but I was struck by the lackluster reporting of the correlation between obesity and indulgence, and wanted to delve further, to look at the compound relationship between indulgence and long-term orientation (LTO), e.g., do shortsightedness and indulgence lead to obesity?
Hofstede's Long-term Orientation and Individuality: Obesity Relationships (using R)
Transitioning to Project Management
As I transition back to project management work, as well as pick up more business analysis skill, I have found it useful to work through a few Pluralsight videos. First, I was quite surprised how much I got from learning more about MS Project. It is much more than just Gantt charting! As for project management, some elements are native to anyone that plans, like a work breakdown structure or the timeline, but there are very important aspects of issue control and communication that need to be part of one's PM toolkit.
Ten Simple Rules for Effective Statistical Practice
An interesting article by six statisticians, Ten Simple Rules for Effective Statistical Practice
, and their aim:
To this point, Meng notes "sound statistical practices require a bit of science, engineering, and arts, and hence some general guidelines for helping practitioners to develop statistical insights and acumen are in order. No rules, simple or not, can be 100% applicable or foolproof, but that's the very essence that I find this is a useful exercise. It reminds practitioners that good statistical practices require far more than running software or an algorithm."
The 10 rules are:
- Statistical Methods Should Enable Data to Answer Scientific Questions
- Signals Always Come with Noise
- Plan Ahead, Really Ahead
- Worry about Data Quality
- Statistical Analysis Is More Than a Set of Computations
- Keep it Simple
- Provide Assessments of Variability
- Check Your Assumptions
- When Possible, Replicate!
- Make Your Analysis Reproducible
Data Analytics Workouts: Recent Posts using R
Below are some recent posts regarding my work learning R, where I took prior work done in Excel (country correlations on human welfare) or F# (solving Project Euler problems), re-executed it, and to some degree improved upon it.
Plotting Text Frequency and Distribution using R for Spinoza's A Theological-Political Treatise [Part I]
Inequality Kills: Correlation, with Graph and Least Square, of Gini Coefficient (Inequality) and Infant Death
Chi-Square in R on by State Politics (Red/Blue) and Income (Higher/Lower)
Logistic Regression in R on Politics and Income
Multiple Regression with R, on IQ for Gini and Linguistic Diversity
Linear Regression with R, on IQ for Gini and Linguistic Diversity
Mean Median, and Mode with R, using Country-level IQ Estimates
Correlations within with Hofstede's Cultural Values, Diversity, GINI, and IQ
ANOVA with Hofstede's Cultural Values and Economic Outcomes
Text Parser and Word Frequency using R
Project Euler: F# to R
Data Analytics Workouts: Recent Posts using Python
I have been exploring Python and R for manipulating data. My long-term goals for this exercise, which I document on the blog, are to develop skills in the aforementioned languages, as well as to extend my abilities with F# and explore other technologies like NoSQL Db's. Things like Hadoop and Spark are likely much farther down the road, if at all.
Exercises: OESMN (Obtaining, Scrubbing, Exploring, Modeling, iNterpreting)
OESMN (Obtaining, Scrubbing, Exploring, Modeling, iNterpreting): Getting Data
Computational Statistics in Python: Exercises
Promising Power: functools and itertools of Python
Iterators, Generators, and Decorators in Python
Recursion in Python
Functions are first class objects: Higher Order Functions
Text Analysis with R for Students of Literature (Quantitative Methods in the Humanities and Social Sciences)
Text Analysis with R for Students of Literature
by Matthew L. Jockers
My rating: 5 of 5 stars
I've just started reading this, but so far it's very promising...
View all my reviews
New NuGet Publication - API Access
I recently published a new NuGet library, API Access
, a simple package for easy access to APIs, currently supporting only Glassdoor. It returns the JSON result, as well as converting it into a full range of necessary classes, accessed via RootObject.
Review: Macroanalysis (Topics in the Digital Humanities)
by Matthew L. Jockers
My rating: 4 of 5 stars
Great foray into data mining literature, although at times a bit tedious and redundant. Conceptually useful, for general concepts in data mining, but with no actual coding.
View all my reviews
Review: Cassandra High Availability
Cassandra High Availability
by Robbie Strickland
My rating: 5 of 5 stars
An excellent high-level view, providing direction and guidance for relative newcomers to Cassandra. My background is 10+ years with relational Db's, covering SQL Server, Oracle, Sybase, cubes, data marts, etc., and this pointed out the mistakes that can be made by someone with a relational Db background. It's an easy read, well-written and high-level, with enough code to make the point, but not so much that reading becomes drudgery.
View all my reviews
Proposed Site Architecture
Review: NoSQL Distilled
NoSQL Distilled: A Brief Guide to the Emerging World of Polyglot Persistence
by Pramod J. Sadalage
My rating: 5 of 5 stars
For developers, an excellent overview and primer of the new database types. Although I think that one needs a good understanding of numerous technology-related topics, this is a fairly light introduction covering the NoSQL incarnations.
View all my reviews
Hackers Prove They Can ‘Pwn’ the Lives of Those Not Hyperconnected
An interesting article
, but one almost immediately ruined by the naivete of the target.
One phrase, click-bait, ruins this. As a non-hyper-connected individual, she did not have enough savvy to avoid the most common ruse. She was trusting in a way she should not have been. I could go on - pardon if this sounds like blaming the victim - but one needs to be cautious, and in fact untrusting, of electronic media. "Consider the source" is no less true in this case than with material one reads or hears.
Question from another reader:
What does the phrase "click-bait" ruin and why?
The title of the article made me think the hack was of a person unconnected from the internet, or possibly the opposite, using IoT. Also, there was nothing ingenious about this. It was the typical old-school social hack, getting someone to open something they should not, like movies, documents, etc., that are actually viruses or links to cross-scripted sites that hijack your entries. The hack was simply finding one of the millions of unsuspecting, naive users...
Installing Cassandra on Linux in Virtual Box
I was looking to get familiar with Cassandra, so in a VM I loaded Ubuntu, Java, and Cassandra. In retrospect, the following links were the fastest way through the task...
This helped as well, Linux Directories
Spatial Ability, Creativity, and Innovation
In response to Are We Ignoring Spatially Talented Creatives, and Losing Innovations along the Way?
I wrote the following:
The types of tests have changed, and maybe narrowed a bit, but when I was 12 years old and in middle school, junior high to some, we took the Differential Aptitude Tests battery. It included, among other things that are standard nowadays, tests for abstract reasoning (non-verbal, non-numeric), mechanical reasoning, and space relations, and these three (3) tests were deemed predictors of "mechanical, technical, and skilled industrial work." Maybe decomposing the standardized tests into batteries would better predict preparedness for modern work, inclusive of, but expanding on, the 3 R's of reading, writing, and 'rithmetic, and in particular identify people that are good at "bending metal."
What is your proficiency with the top 20 TIOBE languages?
If you can, please take this survey I created via SurveyMonkey asking "What is your proficiency with the top 20 TIOBE languages?"
Software Engineers and Obsolescence
An article Software engineers will be obsolete by 2060
, prompted a response from me, citing an interesting article from The Economist titled Automation on Automation Angst
, which reviews several publications on the historical effects of automation; although there is always fear of being replaced, ultimately more jobs are created than destroyed. Software engineers disappear? So what! There will be other jobs, with different titles, and in the interim, the more people use tech, the more need there will be for software engineers.
Response to Request for Opinion
Some quick thoughts on architectural constraints. At best, this software will have 100-200 users, not more than that, and it will not have to handle more than a thousand transactions a day. There will be realtime updates via Bloomberg data libraries, etc., so whichever technology stack we use needs to be robust. There aren’t any super-low-latency requirements like high-frequency trading, but the software should feel responsive, like a desktop application does.
Would love to hear your thoughts. Thanks in advance.
It is not either/or; it is likely both. Ideally, you would build an infrastructure that is platform-independent, e.g., service-based, so that you can reach your data and processes regardless of whether the client is web-based, desktop, or server-side. As for the world going web, I have seen a big increase in WPF job openings, particularly in the trading sphere.
As for specifics, web is lightweight, desktop is heavyweight. The easiest rule of thumb, and by no means comprehensive, is that if you need to manipulate and display large amounts of data, need access to a desktop notification system, need live updating, and need Excel/Bloomberg integration, you would choose desktop. For lightweight apps, that might have less demanding needs, you could use web, and if there are workflows and processing required for reporting and order flow, you might build a set of services and processes that can be triggered from the lightweight front-end to a more serious server-based system.
You might want to keep your technology stack small, since your needs are small, so go with Microsoft, unless your skill sets are elsewhere:
- SQL Server, with SQL and T-SQL
- WCF, et al., for services
- WPF desktop, using MVVM pattern and/or Prism
- ASP.NET MVC for web, with Entity Framework
- SSRS for basic reporting
- VSTO for Excel add-in integration
- C# as the common language for desktop, web, and server
As for third-party tools:
- DevExpress or Telerik for server-side Excel generation
A personal dislike is SSIS. Instead, code a system that can handle FTP, file transfers, and ETL; it is simple, and if architected well, should be able to grow as your needs grow.
PS, Feel free to reach out if you want to chat.
I have published several NuGet libraries
, partially for the experience, since using NuGet for sharing code within an organization seems an easy and worthwhile solution.
Waiting for Deployment
I read an article this morning about the #NoEstimates movement,
Estimates? We Don’t Need No Stinking Estimates!
, which reminded me of an author I have always liked, Samuel Beckett. He has a very dark sense of humor, technically absurdist, and his characters’ statements are not literal. Anyway, it made me think of this, borrowing from Beckett:
Ever estimated. Ever failed. No matter. Estimate again. Fail again. Fail better.
Interesting new links
Some useful new links for mobile development from Google, and for WPF from Christian Moser:
Review - Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries
Framework Design Guidelines: Conventions, Idioms, and Patterns for Reusable .NET Libraries
by Krzysztof Cwalina
My rating: 5 of 5 stars
Excellent material, a broad view for building for external clients...
View all my reviews
Review: Visual Basic.Net Threading Handbook
Visual Basic.Net Threading Handbook
by Kourosh Ardestani
My rating: 5 of 5 stars
My specialty has been as a VBA (Excel/Access) and SQL Server database developer, but I have been edging into .NET work over the past year or two. Although there are some legitimate gripes about this book, in general I've found it to be an illuminating resource. Threading is essential to .NET programming, and the general concepts here have helped me to understand the projects I've been involved with, in both C# and VB. I can say sincerely that the overview in this book is unmatched by material read on the internet, both in depth and quality.
View all my reviews
Review - The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies
The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies
by Scott E. Page
My rating: 4 of 5 stars
Generally, I found the book most engaging for understanding perception, heuristics and decision making, although this did not seem to be the primary premise of the book. As for the writing, it was a bit long-winded, using analogies to make points, even though the concepts themselves are readily accessible without elucidation.
As to its purported focus, it provides academic, empirical, and statistical support for diversity, not necessarily racial or ethnic, with the premise being that diversity of viewpoint within groups is powerful, so much so that it trumps individual excellence.
View all my reviews
Review: C# in Front Office
C# in Front Office: Advanced C# in Practice
by Xing Zhou
My rating: 4 of 5 stars
I already know much of the code for this - I am a long-time VBA programmer that also now codes in C# - and I found the information straightforward and useful. It will likely cause me to refine some of my existing code, and it certainly spurred a few ideas for current development, mostly around RTD-like servers, but the grammar and syntax are at times atrocious. It is still readable, but be prepared for some jarring errors in English.
View all my reviews
Review: Design Patterns: Elements of Reusable Object-Oriented Software
Design Patterns: Elements of Reusable Object-Oriented Software
by Erich Gamma
My rating: 5 of 5 stars
Depending on how you think of programming, this book could be incredibly insightful, or horribly abstract and impractical. Since I prefer and tend to think in patterns and abstractions, I found this book close to my heart. It uses a variety of languages for examples, so a willingness to explore concepts, not practical solutions, is essential.
View all my reviews
Review: Data Mining for Fund Raisers
Data Mining For Fund Raisers: How To Use Simple Statistics To Find The Gold In Your Donor Database Even If You Hate Statistics: A Starter Guide
by Peter B. Wylie
My rating: 4 of 5 stars
My spouse, a development researcher of high-net-worth individuals, was given this book because she was the 'numbers' person in the office. Since my undergraduate was focused on lab design, including analysis of results using statistics, I was intrigued and decided to read it. Considering my background, I found some of the material obvious, while other aspects were good refreshers on thinking in terms of statistics.
Below is the synopsis I wrote at the time I read it:
Purpose of Book
* To provide a general outline of a statistically-oriented method to improve funding activities by mining your current donor database
* To provide general techniques for analyzing data, as well as provide cautions against bad techniques
How the Process Can Improve Endowment Activities
* Allows the organization to more accurately target quality prospects, either to increase participation rates, or to find major givers more inclined to donate
* Allows the organization to reduce costs, or more effectively use limited resources, i.e., phone smaller sets of people, limit the size of mailings, while increasing donations
Outline of Method (Non-Technical)
1. Export sample of donor database
2. Split sample into smaller components
3. Find relationships between donor features and giving
4. Select the significant variables
5. Develop scoring system
6. Validate findings
7. Test finding on limited appeals and compare results
* Assumes the donor data is extractable and randomized
* Requires export from donor database, or access via SQL
* Assumes additional software for statistics (DataDesk, SAS, SPSS)
* Requires IT staff, analytical staff, donor contacts, and management to coordinate efforts
* Requires IT and analytical staff have adequate skills to implement
* Judges data variables both by their intrinsic value and by their inclusion in the database
View all my reviews
Review: A Technique for Producing Ideas
A Technique for Producing Ideas
by James Webb Young
My rating: 4 of 5 stars
If you 'drink' this book, as I did, it is because most of the ideas and methods for producing ideas are second nature to you. If you are not by nature an idea person, this book lays out methodical ways for generating ideas. Either way, the book can be used to expand one's normal idea generation process.
View all my reviews
Review: Programming Interviews Exposed
Programming Interviews Exposed: Secrets to Landing Your Next Job, 2nd Edition
by John Mongan
My rating: 4 of 5 stars
A bit turgid at times, going over the minutiae of basic programming problems, but useful as a good overview of certain types of algorithms and for preparing for programming interviews.
View all my reviews
Align Secondary Axis in VBA, C#, and VB.NET
Recently, I was asked to align two axes in a particular chart as part of a regularly run process that generated 100+ files, each with a chart of this type. The code was prototyped in VBA, but I am also working with a team that will need to replicate the process in server-side .NET, so I converted the code to a C# class, and then ran it through Telerik's code converter to get VB.
The general algorithm used for aligning a chart's secondary axis with the primary axis:
Get primary and secondary divisors
Get upper and lower bounds of secondary axis
Get the larger of the absolute values of the secondary axis bounds
Multiply (max/min) divisor by each of an array of numbers to find the first multiplier larger than absolute maximum
Apply multiplier to major unit
Apply min and max (multiplier x divisors)
The three versions are here:
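The steps above might look like the following minimal sketch in C#. The method name, the candidate multiplier table, and the simplified bounds math are illustrative assumptions, not the original code, which works against the Excel object model:

```csharp
using System;

public static class AxisAlignment
{
    // Returns a candidate major unit for the secondary axis so its gridlines
    // line up with the primary axis's divisions. All names are illustrative.
    public static double SecondaryMajorUnit(double priMin, double priMax,
                                            double priMajorUnit,
                                            double secMin, double secMax)
    {
        // Number of major divisions on the primary axis
        double divisions = (priMax - priMin) / priMajorUnit;

        // Larger absolute value of the secondary axis bounds
        double absMax = Math.Max(Math.Abs(secMin), Math.Abs(secMax));

        // Scan a table of round multipliers for the first whose product
        // with the division count covers the absolute maximum
        double[] multipliers = { 0.1, 0.2, 0.25, 0.5, 1, 2, 2.5, 5, 10, 25, 50, 100 };
        foreach (double m in multipliers)
        {
            if (m * divisions >= absMax)
                return m; // apply as the secondary axis's major unit
        }
        return absMax / divisions; // fallback for very large ranges
    }
}
```

With the multiplier in hand, the min and max are then set to multiplier times the division count, keeping both axes' gridlines in step.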
Saying "No" to Learning...
The time I was most annoyed at someone for pushing learning a new language, it was [defunct technology]. A new developer in the group was pushing a C++/C# guy to learn [defunct technology], while the executive director agreed with me to just let him create whatever-it-was in C#. The reasons I objected:
[defunct technology] is a dying language; [favored technology] is a better option
[defunct technology] is complicated, with limited future use
[defunct technology] would not advance the developer's career
The developer is most effective in his primary languages
[defunct technology] would make future support complicated and harder to fill
As for myself, I gladly learn technologies that I think would be useful, might advance my career, or that I find intellectually engaging, although it is usually not a new language, but more likely a new architecture. My background is in MS Office, desktop, and database development, with light but varied work in web UIs. I was recently expected to pick up more ASP.NET MVC, which is not a radical departure from my own MVP/MVVM desktop designs, as well as things like Entity Framework. The work is fun enough, my productivity increased, they are marketable skills, and they are natural extensions of what I currently work with, although I would gladly pick up some newer technologies given the opportunity:
SAS / MatLab /R
New Code: Threaded SQL Execution (C# + Facade for VBA)
A C# library that simplifies some aspects of connecting to databases and running SQL statements in a threaded manner. The C# code can also be exposed to Excel VBA, enabling it to simultaneously execute numerous SQL statements, something that cannot normally be done in VBA.
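The idea might be sketched as below. The class name, the SqlClient provider, and the Task-based approach are assumptions for illustration, not the library's actual API:

```csharp
using System;
using System.Data.SqlClient;
using System.Threading.Tasks;

public static class ThreadedSql
{
    // Runs each SQL statement on its own Task, waits for all to finish,
    // and returns the rows-affected count for each statement.
    public static int[] ExecuteAll(string connectionString, string[] statements)
    {
        var tasks = new Task<int>[statements.Length];
        for (int i = 0; i < statements.Length; i++)
        {
            string sql = statements[i]; // capture per-iteration copy
            tasks[i] = Task.Run(() =>
            {
                // Each task gets its own connection; connections are not thread-safe
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(sql, conn))
                {
                    conn.Open();
                    return cmd.ExecuteNonQuery();
                }
            });
        }
        Task.WaitAll(tasks);
        return Array.ConvertAll(tasks, t => t.Result);
    }
}
```

Exposed to VBA via COM, a wrapper like this lets a macro kick off many statements at once instead of executing them serially.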
The problem with analogies is that they seem to work on the surface but often hide flaws in reasoning. In any profession, there are gradations of ability, scope, and intellect. In the same way that there are DIY plumbers, there are business users who automate tasks, while architects and engineers are the equivalent of the mechanical engineers who build water systems and design controls. Yes, you cannot simply become a doctor or lawyer, but those professions are guild-like, with particular requirements, and they have a similar hierarchical structuring of ability, from a person tending their own minor injuries, to medical assistant, to nurse, to nurse practitioner, to physician's assistant, etc. Even then, the requirements behind licensing are driven by the government taking an interest in guaranteeing minimum levels of competency in proprietors serving the public, so even cutting hair requires training and a license.
I was saying this recently to colleagues at work: the barrier to entry into programming is very low, yet one cannot be a practicing engineer without at least a degree, several years' experience, and some successful projects. In the same way, one might be hard-pressed to work professionally as a developer without a similar pedigree.
Four files, one for the JS, one for the CSS, one for Modernizr, and one for the HTML/PHP, that work together to create dynamic rectangle layouts in a canvas. The original code was inflexible, with hard-coded height, width, and rectangle count, which I changed to an object with a parameterized constructor.
Workbook Close Event Capture
For anyone that has had to capture the WorkbookBeforeClose event and struggled with the imperfectness of the process, this is C# code that fires only if the workbook is completely closed.
Kudos to the developer; the Wordpress entry is here
Development Speed in VBA
Lately, I have been asked why something cannot be done faster:
- Realistic estimates for delivering quality code are surprisingly long; developers can be people pleasers or overly optimistic, often underestimating the amount of work required and failing to foresee bugs and data issues.
- Developers differ in speed, and it would be unrealistic to expect all developers to code at the same speed as the best developers.
- There is no TDD for VBA. Developers test and validate results manually, which in a VBA environment can be time-consuming.
- Development is more than just being code monkeys. Solutions have to be complete, work within the current system, and be supportable by others.
- When code is new, some experimentation is required to make it complete.
As far as how to make the process faster:
- Speed improvements usually develop over time by creating and reusing functions, classes, patterns, and libraries, as well as by using third-party tools.
- Additional speed improvements can be had by managing projects and oneself better, i.e., avoiding multi-tasking, planning better, and collecting requirements.
OpenXML Document Cache for Office Apps
Working in VB, C#, or VBA, you can add hidden data to Excel workbooks, which I am finding useful for storing information about queries executed for PMs/traders, as described in an MSDN article. Additionally, here is some helper code in VBA and in C# that I have created to read and write the XML.
VBA versus .NET
During a LinkedIn discussion on the value of VSTO add-ins, I wrote about a recent experience choosing between .NET and VBA for Excel:
Some recent reasons I decided to use .NET over VBA:
Asynchronous operations and threading
Version control, both via TFS and on users' desktop
Developer understanding - many shops know C#, not VB (VBA)
One recent reason I needed to revert to VBA:
A portfolio manager needed over 49,000 rows executed and returned to a sheet. When using .NET, I could easily use events to have data retrieved off-hours, execute multiple queries simultaneously and asynchronously, cache the data, and use LINQ to filter it, but VSTO took 40 seconds to write the data, while VBA took seconds. Even though I had all that power with .NET, that slow worksheet dump killed the use of .NET.
Update: I eventually found a solution in .NET.
Advice to a Young Programmer
In response to a Google+ group question, "Hey guys! Does anyone have any tips on how to pre-plan a library or a program in general?"
, I wrote the following:
Just some general ideas:
Consider what you want it to provide
Consider best architecture/language/audience for purpose
Develop use cases
Abstract from the concrete as much as possible
Consider objects, properties, methods
Consider the relationship between objects
Consider the limitations of your choices
Build logging/debugging/messaging in from the beginning
Prototype and validate your ideas before implementing
For myself, I like to head out to a coffee shop and draw my app, the UI, the classes and relationships, and messaging between components. It looks a bit like a UML diagram.
I recently built a first-release Excel add-in for a portfolio manager to retrieve somewhat large data sets (50K rows by 85 columns). The PM's requirements were speed, query flexibility, and a small UI footprint. This size is not normally a problem, but there are limitations in .NET when dealing with Excel. I have separate libraries for threaded SQL execution, desktop logging (there are better off-the-shelf products, like Log4Net), and query table management. The architecture is Model-View-Presenter, since that works well for this kind of app and also provides an upgrade path to a web UI. The Excel UI is a mixture of Ribbon groups and a separate task panel.
I searched to find best methods for dynamic SQL that avoided SQL injection, as well as experimented a bit to find a fast method for retrieving large data sets to Excel via .NET. I used a query table, native to Excel, which provided several requirements natively.
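As a hedged illustration of the injection-safe dynamic SQL mentioned above (the table, columns, and method name below are invented for the example, not the add-in's actual code), values travel as typed parameters rather than being concatenated into the statement text:

```csharp
using System;
using System.Data.SqlClient;

public static class SafeQuery
{
    // Builds a query with typed parameters; user input never becomes SQL text
    public static SqlCommand BuildPositionQuery(SqlConnection conn,
                                                string portfolio, DateTime asOf)
    {
        var cmd = new SqlCommand(
            "SELECT * FROM Positions WHERE Portfolio = @portfolio AND AsOf = @asOf",
            conn);
        cmd.Parameters.AddWithValue("@portfolio", portfolio);
        cmd.Parameters.AddWithValue("@asOf", asOf);
        return cmd;
    }
}
```

Because the provider sends the parameter values separately from the SQL, a malicious string like `'; DROP TABLE Positions; --` is treated as data, not as commands.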
Working with CustomXmlPart
This example shows the basics of working with CustomXmlParts in Office 2007 and later environments. It is a way to store complex information within workbooks. In this example, I create a type, create XML to store the type's information, and then, in different methods, set and retrieve the type.
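A minimal sketch of the serialize/deserialize half of that pattern, assuming a hypothetical QueryInfo type; writing the resulting XML into the workbook would then go through the CustomXmlPart API:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

// Hypothetical type whose state gets stored inside the workbook
public class QueryInfo
{
    public string Name { get; set; }
    public DateTime Executed { get; set; }
}

public static class XmlCache
{
    // Serialize the type to an XML string suitable for a CustomXmlPart
    public static string ToXml(QueryInfo info)
    {
        var serializer = new XmlSerializer(typeof(QueryInfo));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, info);
            return writer.ToString();
        }
    }

    // Rehydrate the type from XML read back out of the workbook
    public static QueryInfo FromXml(string xml)
    {
        var serializer = new XmlSerializer(typeof(QueryInfo));
        using (var reader = new StringReader(xml))
            return (QueryInfo)serializer.Deserialize(reader);
    }
}
```

Round-tripping through strings keeps the storage mechanism (CustomXmlPart, file, database) decoupled from the type itself.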
Technology Workers Are Young (Really Young)
This will seem like biased support of older workers, but it is simply a counterbalance to the inherent bias toward younger workers as described in this NY Times piece
. In truth, younger workers have many qualities to their benefit, but the issue is manifold:
- More experienced workers are presumed to be less skilled, less compliant, etc., but recent studies have shown that older workers have skills comparable to younger workers. In one study, the most skilled and knowledgeable were in their 40's, with less 'skill' to either side.
- Younger workers are sometimes assumed to be more analytical, older workers more productive and stable, albeit more expensive. Younger workers are presumed to have more g (the fluid intelligence stuff), part of creativity, but productive creation often requires new insight mixed with crystallized knowledge, and the experiential aspect is dismissed in the drive for the new.
- Cost plays a part: smaller companies look for cheap employees, and despite productivity, younger workers are often chosen.
- Younger companies, hence younger managers, are biased toward hiring younger workers via affiliation, as well as concerns about control.
- Younger workers have lower opportunity costs, and often take riskier employment without concern for repercussions older workers might have already experienced.
- People are notoriously bad at hiring, and considering the complexity of the competing facets, i.e., cost, productivity, skill assessment, etc., hiring is poorly done.
- Appearances also play a role, in that many older workers likely look particularly old, e.g., overweight, balding, etc., and younger companies might be loath to hire them. It pays to be fit.
Software Design Patterns And Architecture
This is the second response to a LinkedIn Software Architecture group question, Develop classes using Abstractions or always program to an interface:
Are you seeing interfaces, but only one instance of the class, or only one class requiring the class implementing the interface? That sounds unnecessary. If you have multiple classes implementing the same interface, that sounds more appropriate. Interfaces are to some degree contracts, and they are there to force conformity to a minimum standard while allowing flexibility for the concrete implementations. Inheritance can provide the same restriction, but it is often too restrictive, leading to suggestions to prefer composition.
Abstraction is almost always better than concreteness. When I think of bad code, I imagine code that is written for a specific instance and is inflexible. Granted, there are times, particularly for one-off scripting, where literalness is the correct choice, but when designing a system, particularly one that is expected to grow and change, abstraction is the better choice; the only issue is the quality of the abstractions. In VS, it is possible to extract an interface from a class, but this only makes concrete decisions seem abstract, when in fact the concrete class could simply be the result of a series of decisions based on existing issues, with little abstraction involved.
Software Design Patterns And Architecture
This is a response to a LinkedIn Software Architecture group question, Develop classes using Abstractions or always program to an interface:
Personally, I find that starting with abstract ideas/plans are usually best for long-term development. When I have been driven to write overly concrete implementations, I have hated the resulting project's limited flexibility.
There might be several reasons for what you are seeing, some of which you have already mentioned:
Foresight, in that someone was expecting changes or additions that are not currently evident
Some technologies expect interfaces, e.g., MVP, WCF services, IOC...
In reference to MVC: "...models strongly typed, minimized the code needed for the model since it's just an interface but also keeps anyone from accidentally pushing logic down into the model..."
The code analysis built into premium versions of Visual Studio gives points for abstraction. I typically try to attain a high maintainability score, and abstraction and interfaces provide that
The person who started the design was enamored with something they had just read so created everything with interfaces. Subsequent developers maintained the 'pattern'
The interfaces, depending on the IDE, might be afterthoughts, such that someone created a concrete implementation, then later right-clicked and generated an interface from it. Visual Studio (VS) lets you do this.
Worksheet Wrapper Classes
Recently published for VBA: an idea I developed to avoid using arrays while refactoring spaghetti code. It uses a parameter class for the column values of a row, and then a class that holds a collection of the parameter (row) classes, providing type-ahead and methods to get values, as well as basic properties like Count and Item.
Preferred Documentation for Support
Currently producing documentation for an Office VBA project I recently refactored, I wanted to produce the most useful material for support.
The top of the results:
- Source Code
- Code Comments
- Data Models
- Requirements description
The link for the PDF is here
Phantom Debug Error Messages
Phantom debug error messages while coding VBA have always been a thorny problem, but I recently ran across a technique that seems to work. This is a quote from the person who presented it:
After some googling I've found a better (quicker) solution: just end the macro, switch to the VBE, and hit Ctrl+Break. After that, the code execution doesn't break.
As a series of steps:
- When the error message occurs, click Debug
- Press CTRL+BREAK
- Depending on the code location, press F5 to continue
New Twitter Identity
With free time between clients, I have been 'modernizing' websites to HTML5, expanding skills in WPF/Silverlight, and generally increasing my internet visibility. Along those lines I added Google+ and Twitter buttons to my sites so my personal work can be liked. I also created a new Twitter identity, James Igoe @ CodeIgoe
, that I promote separate from my private Twitter identity.
Added Silverlight Log Viewer
I recently replaced a log viewer based on web forms, using either paged SQL or an ASMX-type web service, with one based on Silverlight using the ASMX-type service
. It was quite easy, since a most helpful article
showed how to deal with Silverlight's asynchronous data model and its restricted security.
Although web development is not my focus, I have spent time using various site testing tools, in particular Nibbler by SilkTide
to make enhancements:
- Converted from tables to divs and semantic enclosures
- Scaled images to improve performance
- Reduced CSS file size by converting from table to div layouts
- Implemented GZip compression where available
- Implemented print style sheets
Nibbler also provides a way of owning and revisiting sites, as well as a social aspect via profiles.
On my own time I develop and maintain several websites, and the one that I am currently focused on is CodeDotNet.com
, an ASP.NET site with sample applications built in Silverlight. It also exposes several SOAP-style services.
I plan on expanding the services by developing several RESTful services, for which I will use either WCF or Web API. Additionally, I plan on consolidating and expanding the existing WPF and Silverlight applications into one Silverlight application that can run on the desktop. The new services and the expanded app will interact.
New Site, New Code
Now that I am in between assignments I took time to rework my CodeDotNet.com site
with several significant enhancements, as well as expanded my skills with WPF and Silverlight.
- Created a flat appearance
- Created a new menu system
- Converted to HTML5/CSS3
- Added WPF/Silverlight applications
- Added pages for a planned option calculation service
I see strong market demand for WPF, as well as significant code quality improvements using the newer MVVM-based UI design pattern. Additionally, I expect that I will expand my exploration of services and web APIs.
More Design Patterns
Below are additional example design patterns, initially posted on my DesignPatternInDotNet blog, gathered here as downloadable code modules, making them more accessible to other developers and to search engines:
Windows 8 UX Design Camp in New York City
I am pleased to find out my registration has now been accepted for the event November 28-29.
I recently uploaded example code for a handful of design patterns, part of a self-directed exercise coding each one, primarily to try and understand the nuances of each. The code is implemented in C#.
Similarly, I have been going through the basic algorithms for sort, search, path, etc. in a similar attempt at clarification.
What Makes Great Programmers Different?
This has been reformatted for legibility from the original article
The keys to being a good programmer are well known. Greatness, however, requires something else altogether.
The Dim, the Reckless, and the Jerks
Generally, these programmers have a small skill set and are unaware of its limitations.
- The Dim, those that avoid deep technical conversations, but often overreach in estimating what they are capable of doing or what they know
- The Reckless, those who have the skills but not the discipline, the cowboy programmers of yore. They code according to their own desires, do nothing to integrate their work with that of others, scrimp on basic discipline, and create work for other team members
- The Jerks, those who wield their roles so poorly that they drag down the entire team. They often come from one of the two previous groups, but have been given enough responsibility to make life miserable for others. The only solution when faced with them is to hope they are eventually let go, or to leave yourself
The Good Guys
- The competent desire the right outcomes and strive to contribute to the team.
- They frequently have well-defined skills, a good understanding of development disciplines, and are open to new approaches as long as the material can be integrated with what they already know.
- They write solid code, although only occasionally does it show a truly imaginative solution.
- More commonly, when faced with a difficult problem, they will lapse into quick and dirty hacking.
- They block themselves from greatness by not having the curiosity or knowledge at their command to be more than they are. That is, they refine their skills principally by continued application, not by learning new technologies — unless required to do so by job demands.
- Such programmers are at risk of slipping into the lower grouping by letting their skills atrophy. I discussed several examples of this in my previous editorial on coding from within the echo chamber.
Really good programmers, in addition to the above:
- They carry an abiding passion for programming.
- They like to solve challenging problems and they like to solve them well.
- They are not satisfied with writing one more CRUD app.
- They want the magical work that is hard and requires extended effort to bring to fruition.
- If they cannot find this satisfaction in the workplace, they find it in personal projects or by contributing to open-source projects.
- They frequently test new technologies, try out new languages, explore new tools, and read about programming.
- These developers are interested, consume programming books, and hide out in developer forums as well.
- They revel in challenge and have a constant sense of searching.
- They are looking for the best answer to a problem, or the most elegant.
The next tier up — the final tier — consists of great programmers who have supernormal gifts that enable them to do easily what good programmers find difficult. To my eye, the traits that most stand out are three in number:
- An excellent memory
- Conspicuously wide knowledge of programming techniques and the judgment to know when to apply which
- A deep ability to relate details to the larger picture
The last trait — being capable of quickly shifting registers from the large picture to small details and back again — relies on the strong memory and operates on an almost automatic basis. There is an effortlessness to it, which makes them particularly good architects.
There's one discipline they all share as well, which appears only in varying degrees in the earlier levels: Without exception, they possess a very deep and intimate knowledge of their tools. Be it the editor, the compiler, or the framework, they know the ins and outs of its features and they navigate efficiently. They use a much wider array of features because they know exactly how the tools work.
Knowledge of tools, coupled with an extensive, tested palette of programming techniques, and the ability to remember large amounts of the code base, while relating low-level details to the whole with unconscious ease — these are the traits I see most often in great programmers. And they're the skills I continue to aspire to in my work.
Dispatching Observable Collection
A collection class that handles cross-thread updating of an ObservableCollection, in both C# and VB.NET. The source
contains the original code in C#, as well as code converted to VB.NET.
The original is from michIG's Blog
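The core of the idea might look like this minimal sketch; the class name and the choice to override OnCollectionChanged are assumptions for illustration, and the original post's code may differ:

```csharp
using System.Collections.ObjectModel;
using System.Collections.Specialized;
using System.Windows.Threading;

// An ObservableCollection that marshals change notifications onto the
// dispatcher (UI) thread that created the collection.
public class DispatchingObservableCollection<T> : ObservableCollection<T>
{
    private readonly Dispatcher _dispatcher = Dispatcher.CurrentDispatcher;

    protected override void OnCollectionChanged(NotifyCollectionChangedEventArgs e)
    {
        if (_dispatcher.CheckAccess())
            base.OnCollectionChanged(e);                // already on the UI thread
        else
            _dispatcher.Invoke(() => base.OnCollectionChanged(e)); // marshal over
    }
}
```

Without this, a background thread adding items to a data-bound collection raises the familiar cross-thread exception in WPF.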
A Surge in Learning the Language of the Internet
Below is a comment I left on a NY Times article:
My father worked with computers in the military during the Korean War in the 50's and with corporations in the 60's. He died prematurely, but his brother saw the industry growth and got involved, programming for major corporations through the 70's and 80's. For much of my young adult life I heard "go into computers, it's the wave of the future." I find it amusing, in that it still is. The 'nerds' are a bit less nerdy, but still considered odd and of lower status than other professions.
As for myself, I avoided the 'wave of the future' for quite some time, but with half a CS degree, finishing a BA in Psych and half an MBA, I work as a software developer. Other than the basics I learned in 80's college, i.e., BASIC, PL/I, COBOL, loops, etc., most of my knowledge is self-taught, with books covering algorithms, patterns, data, databases, and software architecture.
New F# Modules
The code base now includes a few F# modules created as part of my learning the language using Project Euler. One is a basic class for calculating standard deviation and variance, while the other contains methods for calculating specialized series, e.g., Collatz, Terra, Fibonacci, and primes, and/or calculations about those series:
Thread Type Selection
Below are criteria to use when selecting a threading type, from a user, claws, in a Stack Overflow discussion. It looks at threads in three different categories depending on the target use: asynchronous delegates, BackgroundWorker, or the ThreadPool:
- Asynchronous delegates - use these when you have work items that should be handled in the background and you care when they finish.
- Use BackgroundWorker if you have a single task that runs in the background and needs to interact with the UI, and you don't care when it finishes. Marshalling data and method calls to the UI thread is handled automatically through its event-based model.
- Avoid BackgroundWorker if (1) your assembly does not already reference the System.Windows.Form assembly, (2) you need the thread to be a foreground thread, or (3) you need to manipulate the thread priority.
- Use a ThreadPool thread when efficiency is desired. The ThreadPool helps avoid the overhead associated with creating, starting, and stopping threads.
- Avoid using the ThreadPool if (1) the task runs for the lifetime of your application, (2) you need the thread to be a foreground thread, (3) you need to manipulate the thread priority, or (4) you need the thread to have a fixed identity (aborting, suspending, discovering).
- Thread class - Use this for long-running tasks and when you require features offered by a formal threading model, e.g., choosing between foreground and background threads, tweaking the thread priority, fine-grained control over thread execution, etc.
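Two of the options above can be contrasted in a short, hedged sketch; the logging collection and method name are invented for the example:

```csharp
using System.Collections.Concurrent;
using System.Threading;

static class ThreadChoices
{
    // Runs one work item on the ThreadPool and one on a dedicated Thread,
    // collecting results in a thread-safe queue.
    public static ConcurrentQueue<string> RunBoth()
    {
        var log = new ConcurrentQueue<string>();
        using (var done = new ManualResetEvent(false))
        {
            // ThreadPool: cheap, short-lived work items, no per-thread control
            ThreadPool.QueueUserWorkItem(_ => { log.Enqueue("pool work"); done.Set(); });

            // Thread class: long-running work with explicit control
            var worker = new Thread(() => log.Enqueue("dedicated work"))
            {
                IsBackground = false,                   // foreground thread
                Priority = ThreadPriority.BelowNormal   // priority is tweakable
            };
            worker.Start();
            worker.Join();
            done.WaitOne();
        }
        return log;
    }
}
```

Note that the ThreadPool item offers neither the foreground guarantee nor the priority setting, which is exactly the trade-off the criteria above describe.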
A LinkedIn Question
I regularly read and respond to relevant questions on LinkedIn, and below is an Access Db question I thought worthy of responding to.
Is it possible to get a Database job (MS Access) where i can work from home? I am a housewife and want to start my [career] initially by working from home. I have [a] Masters Degree in Economics and also good knowledge of the MS Office Package.
I worked from home with some major telecommunications companies for 7 months, building an Access Db for use in an operations area. I enjoyed the time at home, but I went back to the office because I needed the excitement and pressure of the corporate environment, even though the higher corporate rates were consumed by dressing, eating, and commuting.
My feeling is that becoming a developer is fairly complicated if it wasn't your undergraduate focus or education.
- Before I even started, I had had programming courses during the 80's in COBOL, PL/I, and Basic, as well as Pascal (Don't laugh. A for loop is still a for loop, and an if statement is still an if statement)
- I had worked in technology for 10 years, throughout the 90's, using my basic programming skills scripting various desktop activities, installations, registry modifications, etc.
- I picked up Excel VBA in 2000, and subsequently Access VBA, while managing projects. I found it useful for transforming text into useable information
- More project management in 2003 led to more programming in Access to massage and filter mountains of data
- More projects in 2004 and 2005, and much reading (ListMania, with a small selection of the books I've read and recommend, although I've read many more: http://www.amazon.com/gp/richpub/listmania/byauthor/A1O7TGVTV0X9NG)
- Additional work in SQL and database design is required to work with and understand Access
Nowadays I spend my days coding VBA (Excel and Access), .NET (C# and VB.NET), as well as T-SQL for SQL Server.
Although I would not phrase it as harshly as [others], you should at least be better than the majority of people selling their services before considering being a consultant, both for your clients and for yourself. I am more inclined to make evolutionary changes into different careers, leveraging prior experience, all the while trying to maintain a sense of excellence about my work.
My desktop programming and project management was extended by using Excel VBA, then later Access development and SQL, with a primary focus on project management, while leveraging my desktop support knowledge. Later I began working in just Access/Excel Development with SQL, with little project management, then later extending my Access/Excel VBA skills into C# and VB.NET. Clients were typically paying for my expertise in one area, while I was developing skills in another.
With a Masters degree in economics, it might make more sense to set your goals on a data analyst job, which might use Access along with SQL and maybe some other business intelligence (BI) tools, particularly given your background in math and statistics. You can then think of transitioning to MS Access and SQL development, as well as doing more programming.
Have you taken any on-line tests of ability? I use Brainbench to gauge my comparative ability, but there are tools offered by others. It can be both a way of comparing and learning.
Recent Code Additions
Linked below is code using both C# and VBA, implementing a variety of ideas: a threaded execution array, COM automation with C#/VBA, a VBA function usable as a worksheet function for array calculation, and WinForm code to retrieve information from the Outlook Address Book:
A short article appeared on 37signals, How Do I Learn to Program, to which I responded:
For myself, it started with something like passion, but more like a love of problem solving. My father worked with military computers in the 50's and tried teaching me binary numbers in the late 60's - I was around 7 or 8 - so I was accustomed to prods to work in computers. It wasn't until the early 80's that I took a BASIC course. I loved it, but still floundered around for a decade, first in college learning COBOL and PL/I, later doing desktop/server support and project management, all the time using some kind of scripting for work. In early 2000, I needed to script text transformation and decided to use VBA. Well, VBA later led to SQL, PHP, and now .NET. And it still involves a love of problem solving, finding the perfect answer.
As for learning new languages:
- Sometimes passion will get you over the hump and keep you working long days with no reward and little food. I wouldn't count on it, though.
- Keep your goals small, especially if you are a novice. Avoid overreaching.
- Convert code you know into a language you want to learn. Transitioning from VBA to VB.NET, I would drop a VBA class into VB.NET and then fix it until it compiled. Try to keep your transition in small steps.
- Pick a task and work toward it. Plan and build a small website, but focus on good design principles and best practices.
- Certainly read, but maybe read and understand without doing anything at first, until you have a better grasp. Reading and implementing has its benefits as well, since you might be more likely to retain the information.
- If you feel you must, just do it. Expect to make mistakes, if for no other reason than to learn to handle the inevitable feelings of incompetence.
Exposing C# to VBA via COM
I recently toyed with creating a C# project that can be used to expose methods to VBA via COM automation. The project implements one form for printing, but otherwise has no UI. Included is a text file showing how to access and execute the methods via VBA. The project includes multiple methods:
C-Sharp form, printing and formatting options
- void PrintFormat()
- void PreviewThis()
- string SelectFolder(string defaultPath)
C-Sharp OLAP connection-related methods
- string GetDbsToString(string server)
- bool VerifyConnection(string server, string db, string cube)
- bool ModifyConnections(string server, string db, string cube)
C-Sharp refresh pivot methods
- void RefreshWorkbookPivots(_Workbook workbook)
- void RefreshThisPivot(PivotTable table)
- void RefreshSheetPivots(_Worksheet worksheet)
The zip file containing the source code and a text file with VBA to access the COM automation
is available from one of my other sites.
I recently found out that a basic command, ENVIRON, used to get system variables no longer works in Excel 2010, prompting me to research all of the deprecated features in the upcoming release.
Two articles from the MSDN blog:
Article on General Excel Features
Article on Excel Charting Features
Avoiding COM-Related Memory Leaks with Excel
Recently I was handed a combined WCF service and Excel 2007 add-in project done in C#. Although most .NET developers assume garbage collection is handled automatically, that assumption definitely does not hold when working with COM objects, particularly Office objects. Below is some distilled knowledge for working with Excel (Office) COM objects:
- Avoid more than a single dot when setting references to COM objects
- Avoid using references that cannot be released explicitly
- Call FinalReleaseAndNull on all COM objects when finished, and as soon as feasible
- Call Garbage Collection (GC) twice at the end of a routine
- Avoid passing objects in ways that could prevent GC, e.g., where an object becomes part of another process or object graph and can no longer be released
Updated Design Patterns
I uploaded an updated list of design patterns
covering ideas laid out by the GoF, Code Complete, Pattern-Oriented Software Architecture, and Patterns of Enterprise Application Architecture.
.NET Naming Conventions
Uploaded a distilled version of .NET naming conventions
, simplifying standards for use.
Pivot Table Compatibility
To preserve pivot table compatibility, short of writing fairly complicated code, you need to use the 2003 file format, so you can work in both Excel 2007 (Version 12) and Excel 2003 (Version 10). Below are quotes from an MSDN blog explaining why, as well as links to two pages detailing related issues.
Strategies for sharing PivotTables with other users
As noted above, version 12 PivotTables are not downgraded to version 10 PivotTables and will not be refreshable in previous versions of Excel. If you wish to share PivotTables with people using a previous version of Excel AND they have a need to refresh the PivotTables, you will need to ensure that these PivotTables were created as version 10 PivotTables.
How do I create a version 10 PivotTable in Excel 12?
The simplest way to do this is by using compatibility Mode. If you start with a new file, save it to the Excel 97-2003 file format and re-open the file, you will enter compatibility Mode. Any PivotTable that you create while in compatibility mode will be a version 10 PivotTable and will be refreshable when opened in previous versions of Excel.
Compatibility mode in the 2007 Office system
Updated the site today, expanding the types of code shared to include DotNET, both C# and VB.NET, as well as website code. I removed the X/HTML reference from the title, since I typically do not develop that kind of code, and it is a tertiary skill.
Top Six Competencies that Predict Star Performance
I've often posted these bullets over my workspace, as a reminder of ways to maintain high performance. Not that I don't possess these qualities, but it pays to refocus from time to time, to maintain a high level of competence.
- The drive to continually improve performance. These people measure how well they do, search for ways to improve outcomes, set challenging goals, and innovate.
- They are impactful: they can make persuasive arguments based on hard fact, they know how to tailor a presentation to their audience, and they are concerned about their own or their organization's reputation.
- Conceptual thinking: they identify underlying problems and address them, they recognize the key actions that will make a difference, and they spot patterns that matter and make essential connections.
- Analysis: They anticipate obstacles, break problems down systematically, see consequences and implications within a system, and draw logical conclusions.
- Initiative: they are persistent in tackling problems, and take on a challenge or solve a problem on their own before being asked to do so.
- Self-confident: they trust their judgment, seek out challenges, and operate best when given independence.
* Please note that this was taken from a freely-available online article, a link for which I can not find.
OOP and Object Permanence
Many of the benefits with OOP can be applied to other design styles, e.g., design patterns (conceptual, not GOF) and reusability, but for me the differences have been primarily in object permanence and code quality.
I recently read a critique of OOP, and the point from the article that I would agree with is that OOP provides a clearer set of rules, with the end result being better code and the ability to program complex concepts more easily.
Just some quick ideas:
- Procedural code can be quite messy with a tendency to sprawl, something that encapsulation and properties reduce.
- Writing with OOP, one better defines responsibilities, reducing coupling and the problems that ensue from intertwined code.
- State and messaging are improved as procedures generally only return a single item or object, and generally have no state.
- Procedural programming does not lack for reuse, and one of my concerns when I began writing classes instead of procedures was whether I would be losing some flexibility and some reuse.
- Although the GOF are known for design patterns, generally, I think procedural languages could easily have their own set of patterns, albeit not class-oriented.
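The encapsulation point above can be made concrete with a minimal Python sketch (illustrative names only, not code from this site), contrasting procedural state that the caller must thread around with a class that owns its state and the rules guarding it:

```python
# Procedural style: state lives in loose variables the caller must pass around.
def add_trade(trades, total, amount):
    trades.append(amount)
    return trades, total + amount

# OOP style: encapsulation keeps the state and its guarding rules together.
class TradeBook:
    def __init__(self):
        self._trades = []          # private by convention; callers can't casually corrupt it

    def add(self, amount):
        if amount <= 0:            # the class, not the caller, enforces validity
            raise ValueError("trade amount must be positive")
        self._trades.append(amount)

    @property
    def total(self):               # read-only derived state, like a VBA Property Get
        return sum(self._trades)

book = TradeBook()
book.add(100)
book.add(250)
print(book.total)  # 350
```

The property plays the role of a VBA Property Get: callers read a derived value but cannot set it directly, which is exactly the reduction of sprawl and coupling described above.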
Myers-Briggs Type Indicator
The MBTI is a questionable measure, although I like it myself. Ten years ago I consistently tested as an ENFJ, but at some point I made a life choice to be more analytical, hence the change from E to I and from F to T. Nowadays, I consistently test as INTJ, although both the I and the T sit just over the midline.
As a developer, I love the more abstract (N) concepts, such as design patterns, and eschew concrete how-to's (S), but my work-oriented websites would indicate I'm ISTJ, while my blogs and personal sites would indicate I'm INTJ. This made me wonder if the demands of work result in the IT personnel tests skewing to ISTJ. Rather than tech people being unable to change, I'd guess that the world of work, e.g., business people doing the hiring, allows limited change and requires a particular set of traits.
The website used for the Type Analysis of sites:
The Value of Recruiters
A recent article questioned the value of recruiters, since many found them to be worthless, a hostility caused by too many dead ends, unreturned phone calls, etc. For myself, recruiters, and the social networks that sometimes bring them to me, are a necessity. I am a software developer, primarily for the financial industry in VBA for MS Excel and Access, and secondarily with other languages like C#, VB.NET, ASP.NET, HTML, and PHP. I find the job market pretty hungry for my skills, and my current spot was acquired through LinkedIn.
I have worked with websites for over 10 years, professionally and personally, and see them as a kind of marketing tool; part of one of my business school application essays was a website. I have always maintained a strong profile on the job sites, as well as maintained my own personal/professional sites, so when social networking came along, it became another avenue for development, and I naturally 'groom' my web presence. For me, a person who typically gets calls from many recruiters via the job sites, the professional social networking sites are just another step. For others, who don't have the luck, interest, inclination, or ability to pull in contacts via the web, those same sites are a waste of time.
I do not get many calls from people that I network with, except those from former clients, although the recruiters that find me might, either because another developer passed my contact info on - I was recently introduced to a recruiter on LinkedIn via the HR head of a firm I interviewed for but was not hired by - or because they have an 'in' with the hiring manager. I was joking to myself yesterday that the recruiters are my professional networkers, the people paid to do all the social stuff I don't.
I generally hate the belief that people are unmotivated because the work "was too easy/not challenging enough," since the ease of something probably bears little relationship to one's motivation. Yes, some people like challenge, myself included, but motivation often stems from seeing the goal as having worth, either to one's growth or to some larger goal. Challenge can be good, provided one believes there is a real solution, or that one's efforts will be rewarded, but motivation always matters more.
Motivated people don't see challenges, they see opportunities, but you have to get to the point where the task is rewarding. Even then, real people have broader needs than simply goal orientation. Typical motivations:
- Money (not my favorite, but a favorite cliche)
- Social obligation (make other people happy)
- New technology (some thrive on novelty)
- Work with friends
- Advance one's skills
Although this might seem common, it is probably forgotten, particularly in the world of men that coders inhabit, that feelings matter.
The internet and computer use can drive people to multitask, which is a great time and productivity waster, as well as reducing the drill-down required of intellectual thought. As a software developer, the ability to focus for long periods on single aspects is important, as is the ability to think about systems and to choose the correct solution. The focus on internet media, as well as the supposed tendency of youth to multitask, seems more harmful than good, at least in terms of smarts, or at a minimum, productivity. It's not that something positive can't arise from a reduced multi-target focus, but...
Multi-waiting vs Multitasking
I generally try to limit multitasking, but there are times it is more productive: when the wait time involved in the task is greater than the time lost to switching.
Anyway, the issue with multitasking is that there is time lost in switching tasks, e.g., time to recover one's train of thought, and that needs to be balanced against the wait time involved in the tasks switched between.
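The balance described above can be reduced to a one-line rule of thumb. Here is an illustrative Python sketch (my own toy model, with hypothetical numbers), which assumes you pay the switch cost twice, once to leave the task and once to return:

```python
def worth_switching(wait_minutes, switch_cost_minutes):
    """Toy model of the trade-off: switching to another task pays off only when
    the expected wait in the current task exceeds the time lost to context
    switching (paid twice: switching away and switching back)."""
    return wait_minutes > 2 * switch_cost_minutes

# A 10-minute build wait with a 3-minute refocus cost: worth switching away.
print(worth_switching(10, 3))   # True
# A 2-minute wait with the same refocus cost: stay put.
print(worth_switching(2, 3))    # False
```

The doubling of the switch cost is an assumption for illustration; the real cost of recovering one's train of thought varies with the depth of the interrupted work.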
The (Limited) Value of Networking
While I don't support wasting time and energy on worthless activities, I can't vouch for ever having found the recommendation to research to be of any value. I am definitely 'promiscuous' with recruiters, and not a believer in networking, although I have a fairly high profile in LinkedIn.
Unlike most people that have a job that many people are qualified for, I am a niche contractor, not looking for permanent work. I work in a common language, or a set of tools, that few people specialize in, with a background at major corporations. When someone needs one of my specialties, I'll get calls and emails from 5 different recruiters. My specialization is also why I can never commit to one recruiter; when I need work, they don't have openings, but someone else does. And I have had repeat engagements from different clients, and with different recruiters. They find me jobs, take a percentage of the rate, and if the engagement is long-lasting, they barely have to milk the cow, just take the cream.
For extroverts, and those looking to sell, networking might be wonderful, but I'm nauseated by the promotion of networking. People find me engaging, well spoken, helpful, and I can certainly self-promote, but I'm not that interested in other people: I'm kind of half introvert/extrovert. I enjoy people, but I also enjoy my alone time, with my wife, my music, my books, and my fitness activities. Most people are extroverts, and likely, most people are not finding much help networking.
My primary focus has been getting recruiters to see my resume online. Right or wrong, I think recruiters look at people as a very narrow skill set that has to fill or exceed the requirements, particularly in bad times. I keep my headline clear and simple. When searching, I update my resume daily, or a few times a week. I focus on detailing my technical skills, not exaggerating my experience or pumping it up with filler.
In truth, it's been a little bit of luck - I have a desired, niche skill set in technology, with a strong background at major financials - as well as being able to capitalize on my niche market.
New Web Service
On my CodeDotNet.com
site, I added several new features, in particular a web service to return the system log as XML
. It is parameterized to return all entries, page hits, or only errors. My next step is to provide the data as Excel/CSV downloads - a common aspect of what I do - as well as build a system for collecting ideas and code snippets.
Access 2007 is supposedly easier for the basic user and tough for the power user. My applications built for 2002/2003 work without issue on 2007, the only issues being the default disabling of code, which once worked through aren't a continued hindrance, and the changes in screen real estate, which harm the form presentation a bit.
As for ADP vs. MDB (with linked tables) vs. MDB (with stored procs), the last option seems the best, as it provides the best upgrade path for the client and the best security. In a recently-delivered Access with SQL Server (stored procs) application, performance was very good - an ADP was not an option as it was very slow - and since I separated the application forms from the data via a separate class, the app would be fairly easy to migrate to a VB.NET solution using either the web or a fat client. As for security, having no linked tables meant that users couldn't harm the data.
Still developing basic skills with .NET, I added an email form to my website, CodeDotNet
, using an eMail class, and code-behind in a form to send email. Additionally, the details of the email are added to my site log.
I am looking to write the same code in C#; not difficult, but there are some complexities to using the two languages in one project, details that I need to resolve.
For my newish .NET site, CodeDotNet
, I added a Black-Scholes European option calculator, using a VB.NET class, code-behind, and an ASP.NET web page. Some interesting new techniques, at least for me, were the built-in form validators, as well as postback, which maintains the web page values between submissions. If interested, please view it
. Comments are welcome.
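For readers unfamiliar with the formula behind the calculator, here is a minimal Python sketch of the standard Black-Scholes price for a European call with no dividends (this is not the VB.NET class from the site, just the textbook formula):

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(spot, strike, rate, vol, years):
    """Black-Scholes price of a European call option (no dividends)."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * years) / (vol * sqrt(years))
    d2 = d1 - vol * sqrt(years)
    return spot * norm_cdf(d1) - strike * exp(-rate * years) * norm_cdf(d2)

# At-the-money call: S=100, K=100, r=5%, sigma=20%, T=1 year.
print(round(black_scholes_call(100, 100, 0.05, 0.2, 1.0), 4))  # 10.4506
```

Put prices follow from put-call parity, and the same structure maps naturally onto a class with validated inputs, which is what the site's form validators handle on the web side.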
I've been developing an interest in the .NET sphere, and to increase my abilities I bought a new domain, http://CodeDotNet.com
with which I will explore various techniques. My plans are to use the site to expand my skills by sharing ideas and techniques for .NET, combining an ASP.NET website, internet-based web services, a VB/C# desktop client and/or a widget to access selfsame web services.
SQL work makes up a large portion of my day, but in various forms. Much of my work has taken place in MS Access, building queries, equivalent to SQL views, to generate management reporting. When needed, which is often, I open and edit the view as SQL, or reuse SQL elements to speed development. The work in Access encompasses the basic commands of SQL: SELECT, DISTINCT, TOP, AS, DELETE, INSERT INTO, UNION/UNION ALL, GROUP BY (MAX, MIN, COUNT, SUM), ORDER BY, INNER/OUTER JOIN, WHERE, keys, functions, etc. The Access reporting I was hired to build as a temporary solution is, after two years, being migrated to SQL Server with little modification, encompassing hundreds of tables and hundreds of views, as well as specialized scalar functions for age and value grouping. I think this speaks to my foresight, planning, and general understanding of good design.
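The idea of a scalar function for age grouping can be sketched with Python's sqlite3 module (an illustrative stand-in for the Access/SQL Server functions described above; the table and bucket names are hypothetical):

```python
import sqlite3

def age_bucket(days):
    """Group an item's age in days into reporting buckets, similar in spirit
    to the scalar grouping functions used in the reporting described above."""
    if days <= 30:
        return "0-30"
    elif days <= 60:
        return "31-60"
    elif days <= 90:
        return "61-90"
    return "90+"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE breaks (id INTEGER, age_days INTEGER)")
conn.executemany("INSERT INTO breaks VALUES (?, ?)",
                 [(1, 5), (2, 45), (3, 45), (4, 120)])
# Register the Python function so SQL can call it, like a scalar UDF.
conn.create_function("age_bucket", 1, age_bucket)

rows = conn.execute(
    "SELECT age_bucket(age_days) AS bucket, COUNT(*) "
    "FROM breaks GROUP BY bucket ORDER BY bucket"
).fetchall()
print(rows)  # [('0-30', 1), ('31-60', 2), ('90+', 1)]
```

In SQL Server the same grouping would live in a T-SQL scalar function so every view and report shares one definition of the buckets.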
More complex work is typically done using SQL Server, or any DBMS, writing stored procedures/functions/triggers and designing tables to maintain relationships and data integrity. I've provided samples in XLS files that show my basic work, and the following details work done exclusively by myself, and as it specifically applies to SQL:
1. For a Deloitte forensic accounting group, I converted an Excel workbook that used SAS file-based data to use SQL Server. The application was designed to make the accounting data drillable, in that forms would provide server data from which users could select fields and parameters, passing the selections to the stored procedures, and returning the recordset data to pivot charts. In that capacity, I created SQL procedures for loading data (Bulk Insert, Create/Alter Table), and providing interactivity via parameterized stored procedures (#TEMP TABLES, Dynamic SQL, Transactions, transaction audit/logging).
2. For my current client, Transaction Management Group, I was recently asked to build a SQL Server-based application for management, as a prototype of a system they would like to build, to enhance reporting for operational losses, which is named dbIRS Viewer. Although the front-ends are done in Access, the data is normalized on SQL Server. All the data is accessed and modified via stored procedures and views. Some of the SQL is fairly basic, e.g., parameterized SELECT, but a few are more complicated to meet the need for relational data presented in Excel; for that I've used cursors and temp tables to return multiple cells into one. Additionally, I've created functions to provide text value grouping, as well as triggers to update time stamp columns, that I use to prevent users overwriting each other's updates (Optimistic Locking).
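The optimistic-locking technique mentioned in item 2 can be sketched in Python with sqlite3 and a version column (the original used SQL Server triggers and time stamps; the table and column names here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE losses (id INTEGER PRIMARY KEY, amount REAL, version INTEGER)")
conn.execute("INSERT INTO losses VALUES (1, 500.0, 1)")

def update_amount(conn, row_id, new_amount, expected_version):
    """Optimistic locking: the UPDATE succeeds only if the version the user
    originally read is still current; otherwise another writer got there first."""
    cur = conn.execute(
        "UPDATE losses SET amount = ?, version = version + 1 "
        "WHERE id = ? AND version = ?",
        (new_amount, row_id, expected_version))
    return cur.rowcount == 1   # False means a conflicting update won

print(update_amount(conn, 1, 750.0, 1))  # True  - first writer wins
print(update_amount(conn, 1, 900.0, 1))  # False - stale version, update rejected
```

The trigger-maintained timestamp column in the SQL Server version serves the same role as the version counter here: a cheap way to detect that the row changed between read and write, without holding locks.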
As an aside, about 6 months ago, I interviewed for a hedge fund that tested my skills via PreVisor, but before the test I prepared and tested myself via Brainbench. At the time, I was only working in Access, and hadn't done hardcore SQL for about 2 years, but even then, I scored fairly well on a test of SQL (ANSI) Fundamentals. According to Brainbench, I "scored higher than 81% of all previous test takers. Demonstrates a clear understanding of many advanced concepts within this topic. Appears capable of mentoring others on most projects in this area." As for style, although the provided samples don't represent how my code is normally structured - they are downloads into Access tables - I code SQL as cleanly as I code VBA. Other developers are impressed and pleased with the readability of my VBA code, since it is structured, modular, well named, and if the code is long and multi-segmented, commented.
Uploaded an example of a class module used to interface between Access or Excel application and SQL Server
. It incorporates the typical elements of a class, enumeration, property get/set, and functions. It provides a persistent data object for use within an application which can be filtered and passed, as well as encapsulates invocation of stored procedures.
It encapsulates a specific implementation of SQL Server data, with numerous constants within the class, to separate the data interaction from the application. As for methods, it has numerous public methods for invoking stored procedures to retrieve, add, delete, and update records, filter/unfilter the recordset, and private methods for establishing/closing connections, verifying user access, and verifying the class version.
Updated this site coding, modifying the embedded blog aspect by creating code to limit each blog display to 500 characters, and enabling the code to link the display to the complete entry. The PHP was fairly simple, and it will allow me to go into greater detail on code entries, as well as expand my thoughts on issues relevant to programming.
Working with Access, I needed to calculate business days between dates, excluding holidays; with an eventual conversion of the DB to SQL Server in mind, I rewrote the VBA as a SQL function.
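The business-day calculation can be sketched in a few lines of Python (not the original VBA or SQL function, and the half-open date convention here is my own choice):

```python
from datetime import date, timedelta

def business_days(start, end, holidays=()):
    """Count business days in the half-open range [start, end),
    skipping weekends and any dates in the holidays collection."""
    days = 0
    current = start
    while current < end:
        if current.weekday() < 5 and current not in holidays:  # Mon=0 .. Fri=4
            days += 1
        current += timedelta(days=1)
    return days

# Mon 2024-01-01 through Fri 2024-01-05, with New Year's Day as a holiday:
print(business_days(date(2024, 1, 1), date(2024, 1, 5),
                    holidays={date(2024, 1, 1)}))  # 3
```

A T-SQL version typically replaces the loop with a calendar table joined against a holiday table, which scales better for set-based reporting queries.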
Uploaded a class module for zipping a file or files
with appropriate verification of inputs. Although downloadable from this site, my other site, accessed via the link above, will maintain the freshest version.
Still uploading code elements and refining the front-end on a new wiki, VBA @ WikiDot
. In the past week I've created pages for various classes, functions, and XLA's that I've created and find particularly useful.
To facilitate maintaining updated code, I've created a wiki, VBA @ WikiDot
. One of the nuisances of maintaining this website is that providing updated versions of uploaded modules requires working directly with the back-end database, but the wiki will make it easier to directly update code.
Took a commonly shared bit of VBA code that sends (no prompt) a Notes e-mail, and modified it to be reusable as a function, to send a Notes e-mail from passed values for subject, recipient, body, and attachment
Uploaded an Access DB that queries the domain for objects
, e.g., user ID's, along with the related account fields. Requires that you modify code to specify a domain.
Uploaded an Access import class module
that encapsulates redundant import-related file functions for source selection, table creation, data staging, date marking, table append (staging to master), and cleanup.
Uploaded an Excel VBA module that changes the referenced database in data ranges
, provided that all the ranges point to the same DB. Instead of populating an XLS with data via ADO, I've lately used data ranges that are refreshed by pulling data from a DB, and the procedures I write refresh the ranges in the Excel object. The benefits of data ranges are that they more easily incorporate into formatted Excel pages.
Still at DB, and probably like many in the financial IT field, wondering if the looming recession will have an impact on the job market, or even my current situation. For myself, I see little signs of hiring letting up, but then again, I am not invested in searching for a new position.
Busy, busy, busy.
Building on the positive reviews of a recent application for unverified trades, I've been asked to create an application to report and/or settle positions for an equity derivatives group. Additionally, I've recently built an application to automate daily cash equity reporting, designing a DB to import numerous files and FX rates, calculate USD values and count breaks, then output to Excel. Also, I overheard a person describing her need for something to automate reporting for a synthetic equity group, so I showed her a 'model' I've been working on, and I have been asked to create something similar for that team.
Deutsche Bank, my current spot, has recently rolled out an unverified trade reporting application I've created for them in Access, and I'm finishing up the front-ends for an automated reporting application, which aggregates data from numerous sources, ultimately presenting the break count and value in a standard Excel/PDF format.
Recent uploads include code for dynamically resizing forms (Jamie Software) based on current screen resolution, as well as code to backup all forms, queries, and modules in an Access MDB.
I was recently asked to step in on a project, since the current application had issues, in retrospect related to bad design and some communication problems. So far, my design changes and testing rigor have paid off, with no errors reported during the week. Hopefully I'll be able to hand this back and move on to working on my own designs.
I'm implementing something similar, a step up from the other design's use of a shared Excel workbook tied to an Access database, but I'll be using an Access front-end for the interface, as well as synchronization for keeping users' data up to date and performance times down; the users are spread across continents, and a more robust back-end is not available.
Uploaded a basic class module to provide the Excel Workday function in Access.
Still at DB, and still developing a reporting tool, but moving toward smoothing and locking down the code so that it can be run by business personnel. My current project work is shifting toward analytical tools and charting for the group's trade-related data.
I've been working intensely on developing a reporting tool for a trade risk management group at Deutsche Bank for the past two months, and although I have developed some interesting modules, I haven't uploaded any to the website. Expect to see a few updates over the next few days.
Uploaded two modules, one is a function that can be used to calculate target dates, taking into account a date and hours/minutes of work, compensating for weekends, but not holidays, and the other is the entire list of Excel 2003 constants and values as VBA global constants, useful when late-binding Excel in Access automation.
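The target-date function described above can be sketched in Python (illustrative only, not the uploaded VBA; the 8-hour day and 9:00 start are assumptions, and holidays are ignored just as in the original):

```python
from datetime import datetime, timedelta

def target_date(start, work_hours, day_length=8):
    """Add work_hours of effort to start, one working day at a time,
    rolling past Saturdays and Sundays (holidays ignored, as in the entry).
    Assumes work begins at the start time and each later day begins at 9:00."""
    remaining = work_hours
    current = start
    while remaining > 0:
        if current.weekday() < 5:                 # Mon=0 .. Fri=4
            spend = min(remaining, day_length)
            remaining -= spend
            if remaining == 0:
                return current + timedelta(hours=spend)
        current = (current + timedelta(days=1)).replace(hour=9, minute=0)
    return current

# 20 hours of work starting Fri 2024-01-05 09:00 rolls over the weekend:
print(target_date(datetime(2024, 1, 5, 9, 0), 20))  # 2024-01-09 13:00:00
```

The uploaded VBA presumably also tracked minutes; this sketch keeps whole hours to stay short.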
Some of my recent work with NCR has involved verifying user input, e.g., Excel worksheets, as well as Access elements, so I created and uploaded several functions to handle checking of columns, tables, and queries.
I've improved upon the text-handling features of the code search, as well as enabled the use of "and" and "or" searches in the criteria. Additionally, the header image has been changed to a distorted photo of myself, instead of the New York skyline.
I've added a new feature to the website, the ability to search code descriptions
, which I will likely enhance by expanding on the search capabilities. Also, although only of value to myself, I've improved the graphics used in the administration pages.
Uploaded modules to verify data format, specifically column headings, on user-supplied data from Excel.
After a long hiatus, my wedding and related honeymoon, I've returned to update this site. I am still working with NCR, developing a database to manage information between Cisco and ATT. In the near future, I expect to upload some recently developed modules.
Starting a new spot as a Senior Access/Excel Developer, this one remote and at least for the next 3 months, with The Bardess Group
I'm winding up my assignment for Deloitte, converting an XLW from using SAS file-based data to using SQL Server, as well as improving the entire workbook. I've been told that I've given them a new application, since the original was both ugly and written terribly. Also, I've added new features that weren't in the requirements, and overall, created a faster, more functional tool for a forensic accounting team.
In addition to the code for backing up stored procedures to Access, I've uploaded an entire MDB, removing much of the work involved, so using this would require only modification to the connection string.
Uploaded two stored procedures that perform redundant functions: one for creating crosstabs (pivots) in SQL and one that converts strings to tables. Both were written by others.
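A crosstab in plain SQL is usually built with conditional aggregation: one CASE expression per output column. Here is an illustrative sketch using Python's sqlite3 (not the uploaded stored procedure, and the table and column names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, quarter TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("East", "Q1", 100), ("East", "Q2", 150),
    ("West", "Q1", 200), ("West", "Q2", 50),
])

# A crosstab (pivot) via conditional aggregation: one CASE per pivoted column.
rows = conn.execute("""
    SELECT region,
           SUM(CASE WHEN quarter = 'Q1' THEN amount ELSE 0 END) AS q1,
           SUM(CASE WHEN quarter = 'Q2' THEN amount ELSE 0 END) AS q2
    FROM sales
    GROUP BY region
    ORDER BY region
""").fetchall()
print(rows)  # [('East', 100.0, 150.0), ('West', 200.0, 50.0)]
```

The stored-procedure versions typically generate these CASE columns dynamically from the distinct values, which is why they get written once and reused.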
Today's upload is an Access module that builds a table of stored procedures. You define the database, as well as some local variables for output, and the code builds the table, if needed. The procedure then returns all the code within the database's stored procedures, which is then output to an HTML file, since the string lengths are too long for Excel.
Uploaded a module of autosave procedures, including code to add a dropdown menu for selection of times to a toolbar.
The demo of the chart drilldown for my client went well, and according to a colleague, the work itself is "on the top edge of Excel/Access capabilities." It certainly is unique. I'll be uploading the latest version today.
Also, my client has offered me an incentive bonus for finishing early, prorated by how many weeks early I can deliver the product to UAT.
I've uploaded 2 modules to enable drill down into pivot charts via Shift+Click.
Uploaded several new Excel modules covering logging, formatting, worksheet naming, and worksheet-focused illegal character replacement.
My current spot at Deloitte focuses on automating pivot tables and charts, as well as SQL stored procedures, and I expect to have some new shareable code uploaded soon.
The prior programmer's work was terrible - complex code written without comments, indentation, and defined variables - so, in addition to fixing the code, I've been able to modularize numerous redundant components, in particular ADO/SQL components.
My current spot at Deloitte is giving me the opportunity to develop classes for pivot chart/table events, as well as working with stored procedures.
Recently uploaded a module that exports all internal Access tables to Excel.
I'll be starting a new spot with a Deloitte forensic accounting group, a short-term spot doing Excel VBA. Also, this past week I've had interviews with two hedge funds for permanent positions, both of which I am hopeful about hearing from.
Two XLA's were added, both were built several years ago by myself. One converts between dollars and euros - conversion value is acquired from the internet or set by the user - and the other adds 11 operations management statistical calculations to Excel, with explanations on usage provided via a drop-down menu.
Yesterday, I had an interview for an Excel spot, directed the interviewer to this site, then realized the paucity of Excel modules, so I've uploaded several files for parsing, formatting, documenting, and printing.
My portal site
has been modified to XHTML Strict compliance, which required greater use of CSS, illuminating the necessity of separating data from formatting for XML delivery.
Uploaded two modules recently, one a VB Script that loops through the domain to show login information, and a second that exports Access tables to files, strings of insert commands that will rebuild the data in Postgresql.
The site is now XHTML 1.0 Transitional compliant. The next step will be to increase the use of CSS and move the site to compliance with the Strict standard.
The W3C organization provides a very useful set of free utilities for learning and referencing X/HTML and CSS
I am migrating the underlying code to XHTML - it is currently HTML 4.01 Transitional compliant - prompting my use of the W3C material to make certain that I understand the standard thoroughly.
Updated underlying HTML to 4.01 compliance; the CSS was already 2.0 compliant. Added icons to the site's footer component with links to the W3C Markup Validation Service
Uploaded several modules to automate various actions in Access:
- Transpose Array (GetRows)
- SQL Execution and Logging
- E-Mail String Verification
- URL String Verification
- Export All Tables to Excel
On this site, I changed routines that set and used cookies to routines that used session variables, for better security and process flow. This idea will be expanded to include various system messages.
Several enhancements were made to the site:
- Created a new logo for the site header
- Improved the administration graphics
- Improved the index text
- Moved the resume into the database
- Uploaded my most recent resume
In addition, I've upgraded my personal site
to use PHP and CSS.
I renamed the site Code: VBA, PHP, X/HTML, CSS to reflect its content more accurately; it has expanded beyond VBA, driven in part by my interest in PHP.
After vacationing in the Southwest for the past week, I'm back with some clearer ideas about how to code PHP. My latest work has been beefing up security using cookies, as well as streamlining some administrative functions.
I'm in the process of documenting the database I currently support, and instead of the original timeline of one month, I've been given a few days. Also, I was expecting to provide depth and detail, but the team receiving the application wants a "view from 40,000 feet." All in all, the process is going fine, and the target date, today, is really not the end-date, as I will be returning to UBS after my vacation.
A small personal note: I'll be spending the next week with my fiancée in the Southwest, staying in Las Vegas, Idyllwild, and Laguna Beach.
I've made an interesting change to the site today: I created a file, included in each page, that uses the current page's name to display a different title. I plan to use the same technique to selectively show a different search bar for each of the blog, code, and sites pages.
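The shared include can be sketched as a small lookup from script name to title; the file names and titles below are illustrative stand-ins for the site's actual pages:

```php
<?php
// Map the current script's file name to a page title, falling back to a
// default for pages not in the map.
function titleForPage(string $scriptPath): string
{
    $titles = [
        'blog.php'  => 'Blog',
        'code.php'  => 'Code: VBA, PHP, X/HTML, CSS',
        'sites.php' => 'Sites',
    ];
    return $titles[basename($scriptPath)] ?? 'Home';
}

// In the shared header include:
// echo '<title>' . titleForPage($_SERVER['PHP_SELF']) . '</title>';
```

The same lookup table could carry a second column choosing which search bar to render, which is the planned extension.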
As I learn more PHP, it becomes increasingly evident that I need to develop my OOP skills, particularly writing classes.
Good Code and Bad Interviews
I spent a large part of the day trying to program a BLOB insertion using PHP into a MySQL database, with the express purpose of enabling me to upload and share more VBA code, and in the process I've developed a much stronger understanding of PHP.
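BLOB insertion with mysqli prepared statements can be sketched as below; the `code_files` table, its columns, and the open connection `$db` are assumptions, and the chunking helper is mine (large blobs are streamed with `send_long_data()` rather than bound whole):

```php
<?php
// Split BLOB data into pieces small enough to stream to the server.
function blobChunks(string $data, int $size = 8192): array
{
    // str_split('') returns [''] on older PHP, so special-case empty input.
    return $data === '' ? [] : str_split($data, $size);
}

// Assuming an open mysqli connection $db and a LONGBLOB column `body`:
// $stmt = $db->prepare('INSERT INTO code_files (name, body) VALUES (?, ?)');
// $null = null;
// $stmt->bind_param('sb', $name, $null);   // 'b' marks the BLOB parameter
// foreach (blobChunks(file_get_contents($path)) as $chunk) {
//     $stmt->send_long_data(1, $chunk);    // parameter index 1 = body
// }
// $stmt->execute();
```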
The Nomura interview seemed a wash; the technical interviewer took a dislike to me before he had finished sitting down, and nothing I said could sway him from his negative conclusion.
Still waiting to hear the final opinion on an interview at Morgan Stanley for a project management and senior management reporting spot. Everyone except the senior person loved me, particularly the people I was supposed to work with, the same people who performed the technical interview.
Although I've known how to create forms for years, this morning I expanded my forms knowledge to include inserting data into a MySQL server, allowing me to add to this site's blog without directly accessing the server tables.
This minor improvement has also made me aware of how much I need to plan, so that I can avoid SQL injection and other security risks.
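The injection risk is easy to demonstrate: naive string concatenation lets the input rewrite the query. The attacker-style input and table name below are illustrative, and the prepared-statement fix is sketched in comments since it needs a live mysqli connection:

```php
<?php
// Dangerous: the value is spliced directly into the SQL text.
function naiveQuery(string $title): string
{
    return "INSERT INTO blog (title) VALUES ('$title')";
}

$malicious = "x'); DROP TABLE blog; --";
$broken = naiveQuery($malicious);
// $broken now ends with: ...VALUES ('x'); DROP TABLE blog; --')

// The fix (sketch, assuming an open mysqli connection $db): bind the value
// as a parameter so it can never be parsed as SQL.
// $stmt = $db->prepare('INSERT INTO blog (title) VALUES (?)');
// $stmt->bind_param('s', $title);
// $stmt->execute();
```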
Updates: Interview and Site
I have an interview at Nomura Securities today, for a VBA spot; Nomura is working towards a financial data warehouse, with a range of technologies, and I will likely be involved in a reporting function utilizing Excel/Access VBA.
Additionally, I've added a new module to the available list of code, for getting and evaluating the Windows UserID, and added a link to a great Access-focused magazine, Access Advisor.
Learning PHP, HTML, and CSS
PHP, along with CSS, has greatly enhanced my ability to create a low-maintenance, consistent, modular website. CSS has been around for quite a while, but I hadn't bothered to learn it until recently - I don't earn my money making websites - and PHP lets me apply my usual VBA data-management skills, creating recordsets and assigning them to variables, through a web interface.
I've been toying with PHP and MySQL, and enhanced this site to pull database data with PHP. In the process, I've found wonderful ways to reduce website maintenance through a more modular design.