Since October 2008, Ken has been the co-host of the CodeCast podcast show: http://codemag.com/codecast.
Visual FoxPro Product Manager (Aug 6, 2001 - March 2006)
Lately I've been working on a new project with my friend Ian Jacob. Together we co-host a new HubSpot-focused podcast called HubShots.
There are six episodes available so far (and two more recorded and currently being edited).
If you’re interested in inbound marketing, content marketing and HubSpot, then I think you’ll really like the podcast. We’re aiming for 30 minutes or less for each episode, and include a bunch of action items in each as well – so there’s something useful you can try right away.
Would love to know what you think.
For software developers, screen real estate is important – it seems like there's never enough. Seeing code, multiple browser windows, debuggers and command windows all at once – or at least in a way that lets you find all of these windows quickly – is difficult without a ton of screen real estate, lest you get into multi-finger acrobatics. Yeah, we've all done that. For the longest time I've fallen behind in expanding my screen real estate – I've been stuck with a couple of 27" 1080p monitors (plus the laptop screen) for a looong time. I skipped the WQHD/WQXGA era because it seemed like too little too late with 4k on the horizon. However, it's taken a long time for 4k monitors to actually catch on, and even longer for some decent sized 4k displays to become available.
A couple of weeks ago, when I got back to Maui and my office (after 6 months on the mainland), I finally decided to jump in and buy a 4k monitor. And not just any monitor, but a freaking behemoth of a monitor: the 40" Philips BDM4065UC.
4k seems to me the logical next step for monitor resolutions, but 4k on anything smaller than a 30+ inch display is pointless, so I've been waiting for larger models to show up. I discovered the Philips monitor before it was released in the US, when it was available only as an import, and it looked tempting even then. The thing that put me off initially was that it's relatively cheap compared to most other 4k monitors – it's in the same price range as mid to high end ~30" monitors, which seems surprising given that this is one of the very few large 4k monitors out there. So naturally I was skeptical, and with the lack of reviews at the time I decided to hold off.
In the last 6 months I've checked the reviews again and talked to a few people who bought one and said it was good – not great, but good. Since I've never owned a super high end monitor, I figured I could live with the limitations that bother purists and decided to get it.
After a week and a half with this beast I can tell you one thing: there's no way I'm going back to a smaller monitor!
This monitor is very large compared to the 27" displays I've been using in my office. Just to give you an idea, here is the monitor with one of the old 27" monitors on the right and the 15" MacBook Pro on the left. Look how puny the 27" looks compared to the Philips.
Yes, it's a behemoth. When the box showed up at the door it was a definite OMG moment! Sylvia said it must be a mistake with the delivery, and now she's worried I might never emerge from my office again :-). Once set up, the monitor barely fit underneath the mounted speakers…
When I sat down in front of the monitor for the first time I definitely thought: this is going to be too freaking large. I felt like I was sitting in the front row at a tennis match. You definitely have to turn your head to see each of the monitor's edges :-)
But surprisingly, after a day or so of use the monitor no longer feels massive – it feels just right. It takes a little getting used to, in terms of figuring out how to place your windows for efficient access and how to put the content you're working on at the right eye level. The screen real estate is amazing. 4k is essentially four 1080p monitors, and that is a lot of space. Making the most of all this space takes some experimenting – I like to layer my windows so that part of every window is always visible, making it easy to get to each open window, and with this much space it's very easy to keep a lot of stuff open and accessible.
40" is big enough that you can run the monitor at its native resolution without any scaling from Windows or the Mac. I run Windows at 100% and the Mac at the smallest scaled size it supports, and while it's a little on the small side, it's totally doable. I'd say it's roughly the equivalent of a 24" display at 1080p.
To give you an idea of the screen size, consider this screenshot of Visual Studio with 3 edit windows open simultaneously plus a document and test view, plus a full screen browser with the Dev Tools open:
Everything you need on one screen!
The real kicker here is the vertical resolution – if you want to see a lot of lines of code on a single page, having over 2,000 pixels of vertical height is a pure joy. When you're heads down working, this setup is pretty sweet with code, HTML and CSS all open in a single view, plus a code search, an active browser and the browser dev tools. It's pretty damn productive when everything is right there without flipping between different windows or monitors.
Another really cool use for all that screen real estate for me has been running my music recording rig. I use Logic Pro X on the Mac, and running a DAW at 4k is simply amazing.
I can see all my tracks, plus the full track mixer, plus a number of bus views and plug-ins I'm actively working on, in a single view. When I'm actually recording I can see the whole track while it's running, which provides some useful visual feedback.
In short, having this much screen real estate is just awesome. But what's really scary is that going back to a 1080p display to do anything now feels like an 800x600 display of old. It's going to be hard going back to smaller resolutions once you get used to this much screen real estate.
Given that this is a relatively cheap monitor for its size, it's pretty nice. Yes, it's missing some amenities, but the things that really matter for developers are all there and working.
There's lots to like:
Note that in order to get the monitor to run at 60Hz – a requirement if you want to run it at native resolution without severe mouse lag – you have to configure the monitor explicitly via the on-screen menus. Those menus are a bit tricky to work with at first – the control is a funky joystick at the back. The DisplayPort configuration is in the Setup section at the bottom of the on-screen menu.
It's a mystery why they ship this thing with DisplayPort 1.1 enabled by default, when at that level you can't really get good enough screen performance to run at native resolution. You definitely need DisplayPort 1.2 to use this monitor effectively, so make sure you have a video card that supports it.
I'm using the current 15" MacBook Pro with the monitor and it works great. To hook up a laptop you'll need a mini DisplayPort to full DisplayPort cable, which is not included in the box. A host of cables come with the monitor, including a full size DisplayPort cable, but no mini to full DP cable.
You will also need a video card that actually supports 4k video output. Most recent video cards on higher end laptops and most reasonably recent dedicated GPUs should support 4k and DisplayPort 1.2 but be sure to check first.
Driving this many pixels requires a lot of horsepower, and I've noticed the GPU working pretty hard, forcing the MacBook fan to run a lot more than it did before. I also noticed that while running in Parallels, the mouse is not quite as smooth as it used to be; in native Windows (Boot Camp) or on the native Mac there's no problem. You do want to bump the mouse pointer sensitivity nearly as high as it will go so you can get around all of this screen real estate. Any small hiccup in the mouse software or wireless connectivity is noticeable – I'm considering getting a wired mouse to avoid these disconnects.
As I mentioned earlier, when I researched this monitor the reviews were good but not exactly glowing. The bottom line is that this is a good monitor, but it doesn't compete with top of the line displays. This is not an IPS panel, so while the screen is super sharp, the color gamut is average at best. Even after playing around with the color settings on the monitor and in the OS, colors are decent but slightly washed out. I settled on the standard sRGB settings – which are not customizable at the monitor level – with some gamma tweaking in the video card settings to make colors pop a little better. This isn't to say the colors are bad, but compared to high end displays this monitor is not a contender.
The other issue is viewing angles. Because the monitor is absolutely massive, viewing angles matter a lot more than on other monitors – you are affected by them even sitting directly in front of the screen. I've had issues with things at the very bottom of the screen – like Windows Taskbar highlights – being difficult to see because they are so small. If you sit really close to the monitor and look down, the bottom edge starts disappearing, and the higher you sit the more noticeable this becomes. It's a minor thing that could easily be fixed if the monitor had vertical adjustment so the image could be moved up a touch, but in certain color profiles you can't adjust the image position.
Because the monitor is so big, I also noticed a few uneven spots in the display. They're not a problem when you sit right in front of it, but from various angles you can see these spots as slightly shaded or discolored.
The monitor comes on a fixed stand – there's no adjustment for height or angle. On the plus side, the stand is an open metal frame that leaves room to store stuff underneath the monitor.
None of these are deal breakers, and given the price of the monitor it's what you would expect.
To me this is the right size for a monitor: it's borderline too big, but it can display an enormous number of pixels at native resolution. I think this is as big a monitor as you can comfortably use sitting right in front of it, so I don't foresee much bigger monitors coming along and getting much traction. Some would say this is too big, but I think it's pretty close to the sweet spot for 4k displays. It's big, but it doesn't feel too big. I also think even higher resolutions aren't going to matter much for monitors, because this monitor's resolution is already ultra sharp – anything higher and we'd just start scaling the screen down, which seems pointless. So personally, 4k at 30"+ sizes seems like the sweet spot.
I'm rather surprised that there are so few bigger monitors out there. To date there are only a handful, and most of the others are a lot more expensive. I think this will change eventually once more people use these behemoths and they become more common. If you're a developer, once you see one of these – or better yet, once you've had a chance to work on one for a few hours – you'll probably realize very quickly how productive it is to have all this screen real estate.
The Philips is a decent monitor and a great deal for the price. It's bare bones, but it gets the most important job done effectively.
There's no going back for me.
I've been building a number of solutions lately that rely heavily on parsing text. One thing that comes up repeatedly is the need to split strings while making sure that certain string tokens are excluded. For example, a recent Markdown parser I built for Help Builder needs to first exclude all code snippets, then perform standard parsing, then put the code snippets back for custom parsing.
Another scenario is when Help Builder imports .NET classes and has to deal with generic parameters. Typically parameters are separated by commas, but .NET generics may add commas of their own inside generic parameter lists.
Both of those scenarios require that the code first pull a token out of a string and replace it with a placeholder, then perform some other operation, and then put the original value back.
For me this has become common enough that I decided I could really use a couple of helpers for it. Here are two functions that help:
************************************************************************
*  TokenizeString
****************************************
***  Function: Tokenizes a string based on an extraction string and
***            returns the tokens as a collection.
***    Assume: Pass the source string by reference to update it
***            with token delimiters.
***            Extraction is done with case insensitivity
***      Pass: @lcSource   - Source string - pass by reference
***            lcStart     - Extract start string
***            lcEnd       - Extract end string
***            lcDelimiter - Delimiter embedded into string
***                          #@# (default) produces:
***                          #@#<sequence number>#@#
***    Return: Collection of tokens
************************************************************************
FUNCTION TokenizeString(lcSource,lcStart,lcEnd,lcDelimiter)
LOCAL loTokens, lcExtract, lnX

IF EMPTY(lcDelimiter)
   lcDelimiter = "#@#"
ENDIF

loTokens = CREATEOBJECT("Collection")

lnX = 1
DO WHILE .T.
   && flags: 1 = case insensitive, 4 = include delimiters in the result
   lcExtract = STREXTRACT(lcSource,lcStart,lcEnd,1,1+4)
   IF EMPTY(lcExtract)
      EXIT
   ENDIF

   loTokens.Add(lcExtract)
   lcSource = STRTRAN(lcSource,lcExtract,lcDelimiter + TRANSFORM(lnX) + lcDelimiter)
   lnX = lnX + 1
ENDDO

RETURN loTokens
ENDFUNC
*   TokenizeString

************************************************************************
*  DetokenizeString
****************************************
***  Function: Detokenizes an individual value of the string
***    Assume:
***      Pass: lcString    - Value that contains a token
***            loTokens    - Collection of tokens
***            lcDelimiter - Delimiter for token id
***    Return: Detokenized string or original value if no token
************************************************************************
FUNCTION DetokenizeString(lcString,loTokens,lcDelimiter)
LOCAL lnId

IF EMPTY(lcDelimiter)
   lcDelimiter = "#@#"
ENDIF

DO WHILE .T.
   lnId = VAL(STREXTRACT(lcString,lcDelimiter,lcDelimiter))
   IF lnId < 1
      EXIT
   ENDIF
   lcString = STRTRAN(lcString,lcDelimiter + TRANSFORM(lnId) + lcDelimiter,loTokens.Item(lnId))
ENDDO

RETURN lcString
ENDFUNC
*   DetokenizeString
TokenizeString() basically picks out anything between one or more start and end delimiters and returns a collection of these values (tokens). If you pass the source string in by reference, the source is modified to embed token placeholders into the passed string, replacing the extracted values.
You can then use DetokenizeString() to detokenize either individual string values or the entire tokenized string.
This allows you to work on the string without the tokenized values contained in it, which can be useful if the tokenized text requires separate processing or interferes with processing of the original string.
Here's an example with the comma delimited list of parameters I mentioned above. Assume I have a list of comma delimited parameters that needs to be parsed:
DO wwutils
CLEAR

lcParameters = "IEnumerable<Field,bool> List, Field field, List<Field,int> fieldList"
? "Original: "
? lcParameters
?

*** Creates tokens in the lcSource string and returns a collection of the tokens
loTokens = TokenizeString(@lcParameters,"<",">")

? lcParameters
* IEnumerable#@#1#@# List, Field field, List#@#2#@# fieldList

FOR lnX = 1 TO loTokens.Count
   ? loTokens[lnX]
ENDFOR
?
? "Tokenized string: " + lcParameters
?
? "Parsed parameters:"

*** Now parse the parameters
lnCount = ALINES(laParms,lcParameters,",")
FOR lnX = 1 TO lnCount
   *** Detokenize individual parameters
   laParms[lnX] = DetokenizeString(laParms[lnX],loTokens)
   ? laParms[lnX]
ENDFOR
?
? "Detokenized string (should be same as original):"

*** or you can detokenize the entire string at once
? DetokenizeString(lcParameters,loTokens)
IEnumerable<Field,bool> List, Field field, List<Field,int> fieldList
Notice that this list contains generic parameters embedded in the < > brackets, so I can't just run ALINES() on this list. The code first strips out the generic parameters, then parses the list, then puts the tokens back in.
This isn't the sort of thing you run into all the time, but for me it's been surprisingly frequent, and while it isn't terribly difficult to do manually, it's verbose code that shrinks down to a couple of simple helper functions. Maybe some of you will find this useful though…
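The placeholder technique itself isn't FoxPro specific. For readers working in other languages, here's a minimal Python sketch of the same idea – the function names mirror the FoxPro helpers but are my own illustration, not part of any library:

```python
import re

def tokenize_string(source, start, end, delimiter="#@#"):
    """Replace each start...end span in source with a numbered
    placeholder (e.g. #@#1#@#) and return (new_source, tokens)."""
    tokens = []
    # Non-greedy match between the delimiters, case insensitive,
    # spanning line breaks - mirrors STREXTRACT with flags 1+4.
    pattern = re.compile(re.escape(start) + ".*?" + re.escape(end),
                         re.IGNORECASE | re.DOTALL)

    def replace(match):
        tokens.append(match.group(0))
        return f"{delimiter}{len(tokens)}{delimiter}"

    return pattern.sub(replace, source), tokens

def detokenize_string(text, tokens, delimiter="#@#"):
    """Expand numbered placeholders back to their original token values."""
    pattern = re.compile(re.escape(delimiter) + r"(\d+)" + re.escape(delimiter))
    return pattern.sub(lambda m: tokens[int(m.group(1)) - 1], text)

# Mask the generic parameter lists, split on commas, then restore them
params = "IEnumerable<Field,bool> List, Field field, List<Field,int> fieldList"
masked, tokens = tokenize_string(params, "<", ">")
parts = [detokenize_string(p.strip(), tokens) for p in masked.split(",")]
print(parts)
```

Same flow as the FoxPro version: tokenize, do the naive comma split on the masked string, then detokenize each piece.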
I was just poking around in my PATH environment variable, as my machine is a relatively new install. To my surprise I found this new path environment editor in Windows 10 Update 1:
A real editor for editing environment variables? And a way to manage the 50 or so paths I usually end up with in my SET PATH? Hell yeah… it only took 30 years for Windows to add such a simple thing.
It's sad, but this is exciting. How many times have you copied the path string out of the old editor and pasted it into a text editor just so you could read the freaking path, let alone edit it on a single line? Well, this is a welcome, if long overdue, change.
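Incidentally, if you want that one-entry-per-line view without any dialog at all, a few lines of script will do it. This is just an illustrative sketch of the workaround, nothing to do with the Windows editor itself:

```python
import os

def path_entries(path_string, separator=os.pathsep):
    """Split a PATH-style string into its individual, non-empty entries."""
    return [entry for entry in path_string.split(separator) if entry]

# Print each entry of the current PATH on its own line - far easier
# to scan than one long delimited string.
for entry in path_entries(os.environ.get("PATH", "")):
    print(entry)
```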
When you click on any environment variable, you now also get a pop-up window that optionally lets you select a directory or file path:
Also notice that all of these windows are resizable, which is another useful improvement for seeing more of your system variables more easily.
I hope we see more of this sort of thing in future Windows updates. Little things like this make life easier for many menial tasks, and there are plenty of related dialogs dating back to the Windows 3.x era that could be improved in similar ways. Better late than never, and I for one appreciate it!
Although Open Data has been around for several years in other countries and was initiated in Mauritius back in 2012, it is only this year that there has been more momentum towards an Open Data initiative. Back in May 2015 I was kindly contacted by Alla Morrison, Program Officer at the World Bank, asking whether members of the Mauritius Software Craftsmanship Community (MSCC) had a genuine interest in open government data.
As founder of a local IT community and representative of more than 250 software craftsmen, I was positively surprised, and circulated the message immediately. During the second week of June, the Open Data team of the World Bank held several public sessions, and I managed to attend two sessions relevant to developers.
The first event was held at the Prime Minister's Office in Port Louis and focused mainly on the ideas, concepts and benefits of Open Data in general. The use cases and success stories around Open Data were impressive, and it was very interesting to see that solid solutions can be provided by anyone interested in solving a specific problem.
Attendees of various IT user groups and communities in Mauritius at the workshop with the World Bank on Open Data in Mauritius
Delegation of the World Bank during the workshop with local IT user groups
During the second get-together, which was more of a workshop, the team wanted to know exactly what kind of open government data and datasets would be of interest to IT folks here on the island. Based on our requests and their talks with the various ministries in Mauritius, the World Bank team conducted their Open Data Readiness Assessment (ODRA).
Fast-forward to the end of October: the Open Data team of the World Bank completed their assessment, and members of the MSCC and other organisations were again invited to hear about the findings and suggestions first-hand. This time the event was held at the Rajiv Gandhi Science Centre in Bell Village, and surprisingly there were fewer IT people. Nonetheless, it felt a bit like a press conference, and taking notes as well as pictures during the various presentations had a touch of journalism... Even though I'm quite late blogging about it, I got answers to a lot of my questions, and the general outcome of the assessment for Mauritius is positive. Surely there are areas for improvement, but overall it looks very promising for us developers to get our hands on open government data soon.
Other attendees like Ish and SM have already published their thoughts, so I won't repeat those details but just give you a brief summary of the topics I'd be most interested in over the next couple of months. Frankly, here is what I asked Alla upfront via email:
"I started with the preparations of the Developers Conference 2016 - http://www.devconmru.org/ - recently, and I'd like to see whether it would be possible to have access to any datasets of Open Data in April/May next year in order to schedule a hackathon or app challenge during the conference days."
And the signs within the government of Mauritius are looking good: the findings of the World Bank and the increased commitment of the Ministry of Technology, Communication and Innovation (MTCI) make room for solid cooperation and a platform of exchange. Currently there are 15 Open Data datasets already available from the Mauritian Government in a machine-readable format and properly licensed. An Open Data committee will be in place soon, and various activities to provide and promote Open Data are already planned.
Explanation of findings in the Open Data Readiness Assessment and Q&A session
As regards providing access to open government data, the World Bank uses a 5-star rating of Open Data formats. The Mauritian Government is well on its way to a 3-star rating, as some datasets are already available in machine-readable, neutral open formats: CSV, XML, JSON, etc. The advice: first publish "as-is", then engage with the dataset users in order to improve the quality of information and optionally the format(s) over time. Later on, adopt internationally recognised data exchange formats based on the domain of information.
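To make "machine-readable, neutral open formats" concrete, here is a tiny illustration of the same records expressed as CSV and converted mechanically to JSON. The records are invented purely for demonstration and are not from any actual published dataset:

```python
import csv
import io
import json

# Hypothetical sample records - not real government data.
csv_text = """district,population
Port Louis,118431
Moka,83201
"""

# Because CSV and JSON are both neutral, machine-readable formats,
# conversion between them requires no special tooling.
rows = list(csv.DictReader(io.StringIO(csv_text)))
print(json.dumps(rows, indent=2))
```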
General public access and freely available datasets of open government data carry these attributes forward:
The World Bank team also reported GDP improvements of between 0.4% and 4% per annum in other countries as a result of giving people access to Open Data.
Mauritius is already well placed to implement an Open Data initiative. The government is committed to providing open data, and there is strong demand from developers, the private sector and researchers. Technically, the government already has a good foundation for publishing statistical data in an open data format.
Working towards an Open Data Portal could, and should, be led by the MTCI; each ministry should opt for an "implementation cell" working closely with a Chief Data Officer (CDO) and the users of Open Data. The development of detailed policies – "open by default" and its exceptions, licensing, changes to charges and schedules – is inevitable. Clear strategies for audience growth and user engagement are recommended within a short period. The assessment also suggests releasing so-called "quick win" datasets onto the existing portal early, and stimulating hackathons or app challenges organised together with local user groups.
Datasets that could be made open data quickly by the government of Mauritius
Those datasets could be ready within a couple of months and usable for any kind of coding challenges and hackathons. Fingers crossed that they will be. Of course there are more datasets of interest, and it will be our responsibility to ask for such information in a healthy dialog with the corresponding public bodies, mainly the Ministry of TCI.
The findings of the ODRA suggest to the government that any kind of data should be "open by default", with a clear definition of restricted and sensitive data. Obviously, personalised information has to be anonymised by the ministries prior to granting general access. According to the World Bank committee, the necessary competence already exists within the Ministry of TCI. Additional training among ministries could be conducted, and a general guideline for all institutions could be defined, too.
The Open Data Readiness Assessment differentiates between access to information and access to Open Data
The recommended license for Open Data is based on international best practice:
Get free access to data, share it freely, and just give proper attribution to the source of the dataset. Existing data is to be included in the Open Data initiative even if it was previously "published" in closed data formats or under inconvenient licensing. Sounds pretty good, actually.
Let your users become your advocates... a clear recommendation by the World Bank to the Mauritian Government: provide open data and datasets in the right format(s), clarify the various demands of the private sector, and strongly encourage hackathons as well as pro-active advertisement and notification of the Open Data portal and activities around its datasets.
What about "Code for Mauritius"? Idea for a new community around Open Data?
The findings of the World Bank's readiness assessment were informative but brief. The full report is with the government and will be presented to the Cabinet during the next couple of weeks; hopefully it will be published in the near future. As for the MSCC, I had several conversations with key people at the Ministry of TCI and the National Computer Board about how open government data could be helpful for future MSCC activities, and how we can improve the dialog with public bodies with regard to more transparency and the nourishment of local IT talent. The success stories from other countries were really inspiring, and I'm very confident that similar results can be produced here in Mauritius. Given that datasets will be available for free and under the right license, anything is possible.
Note: This post is based on ASP.NET 5 beta8; the samples might not work for you anymore.
Docker has become a popular way of hosting Linux based applications over the last few years. With Windows Server 2016, Docker containers are also coming to Windows and are likely going to be a popular hosting option there as well. But as Windows support is only in CTP form right now, and ASP.NET is moving to Linux with the CoreCLR, I decided to try and run an ASP.NET web site in a Linux based container.
To get started I installed ASP.NET 5 beta8 using these steps. Once these were installed, I created a new ASP.NET MVC project. Just to show where the application is running, I added a bit of code to the About page. When running from Visual Studio 2015 using IIS Express, it looks like this:
The extra code in the HomeController:
With the following Razor view:
So far so good – we can see the application runs just fine on Windows using the full CLR. From Visual Studio we can also run the application using Kestrel with the CoreCLR, as shown below.
Having installed Docker Machine, we can also run this same website in a Docker container on Linux. The first thing we need to do is create a Dockerfile with the following contents:
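The original Dockerfile contents aren't reproduced in this excerpt. As a rough sketch, a minimal beta8-era Dockerfile looked something along these lines – the image tag and the dnu/dnx commands are from the long-deprecated beta tooling and may not match your setup:

```dockerfile
# Base image with the ASP.NET 5 beta8 runtime (beta-era tag, now obsolete)
FROM microsoft/aspnet:1.0.0-beta8

# Copy the application into the container and restore packages
COPY . /app
WORKDIR /app
RUN ["dnu", "restore"]

# Listen on port 80 and start the site with the "web" command
# defined in project.json
EXPOSE 80
ENTRYPOINT ["dnx", "web"]
```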
With this container definition in place, we build the Docker image using the docker build -t web-application-3 . command. This is executed from the Docker terminal window in the main folder of our web application, where the Dockerfile is located. This builds our Docker image, which we can now see when running docker images.
With this new image we can run it using: docker run -d -p 8080:80 web-application-3
With the application running we can navigate to http://192.168.99.100:8080/Home/About, where 192.168.99.100 is the IP address of the Linux virtual machine running the Docker daemon.
Last week was a fun but very busy one for me at Southwest Fox, with 2 very long days of Web Connection training and then 3 more days of sessions at the conference.
Anyway, whether or not you attended the conference, here are the links for the materials for the main Southwest Fox sessions, which are hosted in BitBucket repositories.
This session demonstrated how to build SOAP 1.x based Web Services, using .NET ASMX services as the intermediary both to create Web Services and to consume them using .NET. On the server side, the samples demonstrate how to use OleDb for direct access to FoxPro data, as well as using MTDLL COM objects to call FoxPro business logic and return objects for consumption by .NET code. Both styles rely on some connecting .NET code to provide the service front end. On the client side, the examples demonstrate importing a Web Service from WSDL and generating a .NET class, then using wwDotnetBridge to call the generated .NET proxy.
During the two day training we focused on server side application development on the first day. The example is a small time tracking application that is mobile friendly using responsive design and takes advantage of some of the new features in Web Connection 6.0, including layout pages, partials and sections.
I also want to remind those of you that attended SW Fox that there's a 10% discount available on all West Wind products. You can use the discount code SWFOX_2015 on the shopping cart to apply the 10% discount. Please provide your SW Fox badge number with your purchase to qualify.
Dates for next year's Southwest Fox conference were announced for late September. If you haven't been before, it's a great place to meet people and catch up on new ideas for extending the life of FoxPro just a little longer while gaining new skills at the same time. Mark your calendar.