Planet FoxPro

December 19, 2014

Rick Strahl's FoxPro and Web Connection Web Log

Use wwJsonSerializer to create Sample Data

I frequently create demos for various applications and components, and quite frequently I create objects on the fly from EMPTY objects using ADDPROPERTY(). The code that does this is a bit verbose and looks something like this:

*** Note objects are serialized as lower case
loCustomer = CREATEOBJECT("EMPTY")
 
*** Recommend you assign your own ids for easier querying
ADDPROPERTY(loCustomer,"_id",SYS(2015))
ADDPROPERTY(loCustomer,"FirstName","Markus")
ADDPROPERTY(loCustomer,"LastName","Egger")
ADDPROPERTY(loCustomer,"Company","EPS Software")
ADDPROPERTY(loCustomer,"Entered", DATETIME())
 
loAddress = CREATEOBJECT("EMPTY")
ADDPROPERTY(loAddress,"Street","32 Kaiea")
ADDPROPERTY(loAddress,"City","Paia")
ADDPROPERTY(loCustomer,"Address",loAddress)
 
loOrders = CREATEOBJECT("Collection")
ADDPROPERTY(loCustomer,"Orders",loOrders)
 
loOrder = CREATEOBJECT("Empty")
ADDPROPERTY(loOrder,"Date",DATETIME())
ADDPROPERTY(loOrder,"OrderId",SUBSTR(SYS(2015),2))
ADDPROPERTY(loOrder,"OrderTotal",120.00)
loOrders.Add(loOrder)
 
loOrder = CREATEOBJECT("Empty")
ADDPROPERTY(loOrder,"Date",DATETIME())
ADDPROPERTY(loOrder,"OrderId",SUBSTR(SYS(2015),2))
ADDPROPERTY(loOrder,"OrderTotal",120.00)
loOrders.Add(loOrder)

While this works, it’s pretty verbose and a little hard to read with all the nesting and relationships. It’s still a lot less code than explicitly creating a class and then assigning properties, but even so, it’s hard to see the relationship between the nested objects. The more nesting there is, the nastier this sort of code gets, regardless of whether you use AddProperty() or CREATEOBJECT() with assignments.

Given that JSON serialization is readily available now, it’s actually pretty easy to replace this code with something that’s a little easier to visualize.

Take the following code which produces a very similar object by using a JSON string serialized to an object:

DO wwJsonSerializer
 
*** Create a sample object
TEXT TO lcJson TEXTMERGE NOSHOW
{
    _id: "<< loMongo.GenerateId() >>",
    FirstName: "Rick",
    LastName: "Strahl",
    Company: "West Wind",
    Entered: "<<TTOC(DateTime(),3)>>Z",
    Address: {
        Street: "32 Kaiea",
        City: "Paia"
    },
    Orders: [
        { OrderId: "ar431211", OrderTotal: 125.44 },
        { OrderId: "fe134341", OrderTotal: 95.12 }
    ]
}
ENDTEXT

loSer = CREATEOBJECT("wwJsonSerializer")
loCustomer = loSer.DeserializeJson(lcJson)

? loCustomer.LastName
? loCustomer.Entered
? loCustomer.Address.City

loOrder = loCustomer.Orders[1]
? loOrder.OrderTotal

The data is created in string format and embedded inside of a TEXTMERGE statement to produce a JSON string. The JSON string is then sent through a JSON deserializer which in turn produces a FoxPro object (created from EMPTY objects and Collections).

Note that you can also inject data into this structure, because the TEXTMERGE clause allows merging FoxPro expressions. Again, this works well to essentially ‘script’ your object definition.
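If you want to double-check what the deserializer actually built, you can also round-trip the object straight back to JSON. A minimal sketch, assuming the Serialize() method of the same wwJsonSerializer class:

*** Sketch: serialize the generated object back to JSON
loSer = CREATEOBJECT("wwJsonSerializer")
lcResult = loSer.Serialize(loCustomer)   && assumes wwJsonSerializer's Serialize() method
? lcResult   && shows the full nested structure that was created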

Caveats

I use this mostly for demos and other scenarios where I need to create some static data quickly and for that it works very well because I can control the content exactly with the values I push into the object.

However, keep in mind that JSON requires data to be encoded to a certain degree. Strings need to escape extended characters, quotes and a host of other things. If you’re using Web Connection you can use the JsonString() and JsonDate() functions in the wwUtils library to handle that encoding for you:

TEXT TO lcJson TEXTMERGE NOSHOW
{
    _id: "<< loMongo.GenerateId() >>",
    FirstName: << JsonString(lcFirstName) >>,
    LastName: << JsonString(lcLastName) >>,
    Entered: << JsonDate(DateTime()) >>,
    …
}
ENDTEXT

Although this gets a little more verbose as well, it’s still easier to read than the pure FoxPro object syntax.

Visualize

Anyway – something to keep in mind. When you need more expressive syntax to construct objects – and especially complex objects – a JSON serializer might just do the trick and provide you with a complex structure in a more visual way.

by Rick Strahl at December 19, 2014 04:43 AM

December 18, 2014

Alex Feldstein

December 17, 2014

Alex Feldstein

December 16, 2014

Rick Strahl's FoxPro and Web Connection Web Log

Visual Studio Community Edition–Visual Studio for Web Connection

A couple of months ago Microsoft announced the release of a new Visual Studio Community version to replace all the Express versions. As you probably know, the Express versions of Visual Studio were severely trimmed down and specialized versions of Visual Studio – essentially a core version of Visual Studio with only certain packages enabled. While these versions were functional and certainly provided a lot of value, each of the Express versions had some major features missing. The Visual Studio Community edition is a full version of Visual Studio Professional and is available to most developers outside large enterprises free of charge.

Web Express Limitations for Web Connection

In the past I’ve recommended using Visual Studio Express for Web, especially for developers taking advantage of the Web Control Framework. It provides the excellent HTML, CSS and JavaScript editors, and explicit support for WebForms, which is what the Web Connection Web Control Framework uses to provide its designer support in Visual Studio. The Express versions had some limitations that definitely impacted Web Connection, namely the lack of support for the Web Connection Add-in, which provides the ability to bring up Visual FoxPro and your favorite browser directly from a Web Connection page.

With the intro of the Visual Studio Community edition, that limitation – and many others – is gone.

Visual Studio Community Edition is Visual Studio Professional

The new Community edition is essentially a free version of Visual Studio Professional for most individuals and small organizations. It provides the full feature set of Visual Studio Pro which includes the support for plug-ins, multiple projects and the whole gamut of projects that Visual Studio supports. There are no limitations or disablements – it’s a full version of Visual Studio Pro.

If you are a Web Connection developer and you’ve been using the Web Express Edition, you should definitely go ahead and download and install the Community Edition to get support for the Web Connection Add-in. This version also allows you to open multiple projects in the same solution, so you can have multiple Web sites open in one solution, as well as open the Web Connection Web Controls design time control project for creating your own components.

You can download the Community Edition from the Visual Studio Download site

Who can use the Community Edition

As mentioned, the Community Edition is free to most small to medium sized organizations; with the exception of larger enterprise organizations, you can use Visual Studio Community edition for free. Here’s what the Community Edition site shows for who can use it:

  • Any individual developer can use Visual Studio Community to create their own free or paid apps.

Here’s how Visual Studio Community can be used in organizations:

  • An unlimited number of users within an organization can use Visual Studio Community for the following scenarios: in a classroom learning environment, for academic research, or for contributing to open source projects.
  • For all other usage scenarios: In non-enterprise organizations, up to 5 users can use Visual Studio Community. In enterprise organizations (meaning those with >250 PCs or > $1 Million US Dollars in annual revenue), no use is permitted beyond the open source, academic research, and classroom learning environment scenarios described above.

What’s the Downside?

I think that the Community Edition is a huge step in the right direction for Microsoft. Developer tools are becoming more and more of a commodity item and are used more to sell a platform than anything else, and this edition at least provides a full featured development tool for those that want to work with Microsoft technologies. The fact that the HTML, CSS and JavaScript editors are top notch for any kind of Web development is an additional bonus.

The one downside is that Visual Studio is large. It uses a fair amount of system resources, so you need a reasonably fast machine to run it. However, it’s nothing like the resource hog that older versions of Visual Studio were. If it’s been a few years since you last tried Visual Studio, you really should give it another shot, as performance – especially editor performance and load times – has improved significantly.

As a Web Connection Developer Should you care?

If you’re using the Web Control Framework you will definitely want to use the Community Edition as it gives you full support for the designer integration and the Web Connection Add-in to load your WCF pages. If you’re using any other edition, go make the jump – it’s totally worth it.

But even if you’re not using the Web Control Framework that provides integration with the WebForms designer, there are many features in Visual Studio’s HTML, CSS and JavaScript editors that are top notch, if not the best, compared to just about anything out there. The only other tool I would put even close in comparison is JetBrains WebStorm, which is an excellent tool for pure HTML5 and JavaScript (including Node) based applications.

If you are using any version of Visual Studio 2012 or 2013 (including the Community Edition) make sure you install Web Essentials which provides additional HTML, CSS and JavaScript features including support for LESS, SASS, Zen Coding support (very efficient HTML shortcut template language), automatic JS and CSS minification and much more. As the name implies it’s essential and I don’t consider this an optional add-in.

Regardless of whether you do Web Connection development, or raw HTML coding, here are some features in Visual Studio that I wouldn’t want to live without:

  • HTML Element and Attribute IntelliSense
  • HTML and CSS Formatting (as you type or explicit)
  • CSS IntelliSense (CSS properties, reference props in docs)
  • F12 Navigation to CSS classes
  • Automatic LESS parsing and saving
  • Excellent JavaScript IntelliSense (especially with _references.js file)
  • Basic JavaScript Refactoring

Web Essentials:

  • Zen Code
  • Script and CSS Compression
  • HTML Color Picker, Darken/Lighten
  • CSS Browser Support Preview
  • CSS Vendor Prefix injection
  • Font and Image Previewers

There’s lots more, but those are some of the highlights for me. If you haven’t tried Visual Studio in a long while or were put off by the pricing of the full versions, give the Community edition a try…

by Rick Strahl at December 16, 2014 11:04 AM

Alex Feldstein

December 15, 2014

VisualFoxProWiki

SouthwestFox

Southwest Fox 2015


October 15-18, 2015
SanTan Elegante Conference & Reception Center in Gilbert, Arizona
http://swfox.net
Blog: http://swfox.net/blog/index.php
RSS/Atom: http://swfox.net/blog/feed
Twitter: http://twitter.com/swfox

December 15, 2014
Geek Gatherings LLC is officially asking you to save the dates for Southwest Fox and Southwest Xbase++ 2015! The conferences take place October 15-18, 2015.

We look forward to seeing you in Arizona in October!

Regards,
Rick Schummer
Doug Hennig
Tamar Granor

December 15, 2014 08:01 PM

SouthwestFox2014

Southwest Fox 2014


October 16-19, 2014
SanTan Elegante Conference & Reception Center in Gilbert, Arizona
http://swfox.net
Blog: http://swfox.net/blog/index.php
RSS/Atom: http://swfox.net/blog/feed
Twitter: http://twitter.com/swfox

October 7, 2014
We are streaming the keynote presentation, Thursday at 7:00 p.m. Mountain Standard Time (Arizona does not observe Daylight Savings Time), again this year. http://www.ustream.tv/channel/swfoxtv

October 3, 2014
The Guidebook for Southwest Fox/Xbase++ 2014 is now available.

September 10, 2014
Alaska Software is offering special discounts on Xbase++ 2.0 for Southwest Fox and Southwest Xbase++ 2014 attendees.

August 22, 2014
Alaska Software has a news release titled "Xbase++ meets Visual FoxPro" in which they discuss the relationship between Xbase++ and VFP and what their sessions at Southwest Xbase++ will cover.

July 28, 2014
Listen to Andrew MacNeill interview Rick Schummer, Tamar Granor, and Doug Hennig about Southwest Fox 2014 on the
FoxShow #79. If you haven't registered yet, be sure to listen for a special offer in the show.

July 11, 2014
The winners of the White Light Computing scholarships are Michael Hogan and David Dulberg. The winner of the Tomorrow's Solutions scholarship is Patrick Murtaugh. The winner of the Stonefield Query SDK is Randy Godfrey.

May 27, 2014
Registration for Southwest Fox and Southwest Xbase++ 2014 is now open.

May 15, 2014
Speakers and sessions for Southwest Fox and Southwest Xbase++ 2014 have been announced.

March 6, 2014
Geek Gatherings invites prospective speakers to submit session proposals for Southwest Fox and Southwest Xbase++ 2014. See http://www.swfox.net/CallForSpeakers.aspx for details.

January 7, 2014
The dates for Southwest Fox and Southwest Xbase++ 2014 have been announced: October 16-19, 2014.

We look forward to seeing you in Arizona in October!

Regards,
Rick Schummer
Doug Hennig
Tamar Granor

December 15, 2014 08:00 PM

Doug Hennig

Save the Dates for Southwest Fox 2015

Geek Gatherings LLC is officially asking you to save the dates for Southwest Fox and Southwest Xbase++ 2015! The conferences take place October 15-18, 2015.

This year we again have two conferences as one great event at the same location. The conferences offer developers an opportunity to learn and extend their skills, and network with fellow developers. You get two conferences for the price of one!

The conferences take place at the San Tan Elegante Conference and Reception Center, the same great location as last year.

At the 2014 conferences, we mentioned our risks in running the conferences, primarily the tens of thousands of dollars we have to commit to the conference center. Fortunately, we found a way to mitigate those risks to an acceptable level, so we're happy to be hosting the conference again in 2015.

We choose speakers from the developer community based on their session proposals. We are always looking for new speakers. Watch the Call for Speakers page of the conference website; we'll post the Call for Speakers in mid-February.

All the details about registration, speakers, sessions, and more will be available in May.

Special Request!!!

If there are any topics you hope will be covered this year, please send them to info@geekgatherings.com right away.

by Doug Hennig (noreply@blogger.com) at December 15, 2014 07:58 PM

VisualFoxProWiki

ChicagoFUDG

Editor comments: December program details
FUDG stands for FoxPro User / Developer Group. This one is based in Chicago, IL. Visit our web site at www.ChicagoFUDG.com!

Leaders:
Michael Hogan
Randy Bosma Speaker Agitator
Jeff Simon Treasurer jeff@datamarkcorp.com
Tom Corrigan Secretary (his FUDG e-mail: tom@corrigan.com)
Greg Gershuny gel4it [at} hotmail {dot] com

Bill Drew Past President bill.drew@sbcglobal.net

Monthly Meetings:
When: Chicago FUDG normally meets on the third Monday of each month, unless it collides with a major event. The meeting room is normally accessible at about 5:15 PM; the presentation starts shortly after our 'business chatter' and introductions, at about 5:30.
Where: 1871 at the Merchandise Mart suite 1212 (enter from Orleans on the west side of the building and take the west elevators to the 12th floor). When you arrive at the 1871 front desk (adjacent to the west elevator bank) ask to be directed to the FoxPro meeting in the IMSA classroom. Reasonable ($10) parking is available 1 block north of the Orleans Street (west) entrance of the Merchandise Mart.

December 15, 2014 02:18 PM

UpcomingEvents

Editor comments: Chicago's December meeting
A place to list upcoming Visual FoxPro events like conferences, meetings, user groups, open training sessions...
Closest at the top please, and please remove past events.

December 15, 2014 01:39 PM

Alex Feldstein

December 14, 2014

Alex Feldstein

December 13, 2014

Articles

Ten years of blogging - 400+ articles written

There are 86400 seconds every day, and literally every single one matters...

Delayed entry - delayed jubilee

Unfortunately, it usually doesn't work this way. Mea culpa: this article should have been written back in July this year, when this blog had its actual 10 year jubilee – the blog started, to be precise, on July 12, 2004. My first blog article is "Verspäteter Einstieg in die Welt der Blogs", meaning "Delayed entry into the blogosphere". Yes, you read it correctly... at that time blogging was already very hip, and honestly I was somehow reluctant to start. Of course, I already knew what "web logging" was all about at that time, but I didn't have the drive to actually start something of this kind.

Well, now we are at the end of 2014 and my stats don't look that bad actually. Sure, there are others with more regular publications and more interesting content but whatever you might say, it's my personal online log book and it will stay like this.

Let's get bored with some stats

So far, I have managed to kick myself more than 400 times to sit down and type some content. That's roughly one article every nine days on average. Well, almost what I had in mind... one article per week. So you see that there is still lots of room for improvement after all. Furthermore, I'd like to mention that the experience I had with this blog was quite a roller coaster ride – currently, this is the third iteration of blog software in the background. Originally, I started to write using a software system called "dasBlog", then after a severe server crash I migrated almost all content to a system called "blogEngine.NET", and after another data / server problem I finally settled on the Joomla! CMS with a blog layout style.

But there had been other forces during those 10 years which had a major impact on the consistency of my writing. First of all, I immigrated to Mauritius back in 2007 and was completely occupied with building up a new company for my former employer; then there were huge changes in my personal life, with at least three big events to report; and due to some financial issues I kept myself extremely busy founding and running a start-up back in 2009 in order to keep food on the table. Most interestingly, you will see that impact in the number of articles written here on the blog.

Stats: Blog articles per year

Let me give you a brief summary on each year...

2004

As mentioned earlier, I started to publish only at the beginning of the second half of the year, and I was quite unsure whether I would be able to keep the work up over a longer period of time or not. Well, it was mainly to get familiar with the medium and to see whether it could be interesting to continue or not. Also, in 2004 I was invited to join the Microsoft CLIP (Community Leader/Insider Program) initiative. Thanks to that I had lots of activities to write about.

2005

Thanks to the German Developers Conference back in November of the previous year, I got a better understanding of the benefits and advantages of maintaining an online log book. Most of the international speakers at the conference had already been blogging for months, if not years. Seeing their content and their way of writing, as well as the pros of expressing oneself on the internet, kept me going. And CLIP events kept on coming, and I published my experiences as well as my adventures regularly on my blog, too.

2006

Seeing the feedback from others and, let's call it, the personal success of writing more than 2 articles per week on average, I continued to write regularly on my blog. Mainly it was about technical things related to my main programming language at that time – Microsoft Visual FoxPro – but still a lot of things related to my user group and community activities. Again thanks to the German Developers Conference, I got positive feedback from other international speakers that they actually enjoyed reading my blog – even though I wrote in German only.

2007

Well, well... Time to immigrate to Mauritius and occupy myself with other tasks. My former employer asked me to come here and start a newly founded company with an initial team of 15 employees. We hired 10 freshmen directly out of university, 2 more experienced developers as team leads, a secretary for the daily paper work, a Mauritian director for all kinds of labour, business and accounting related matters, and last but not least myself as project coordinator and supervisor – the communication channel back to Germany.

2008

Remorse... this could be the right word for 2008. Due to the blogging abstinence of the previous year, I thought that it would be time to pick up the blog again and try to publish a little bit more than before. Well, it didn't turn out too prolific, but after all I managed to write at least one article bi-weekly. Not too technical for a start, but some content after all the obstacles I had to experience in that particular year.

2009

After almost 10 years, my employment relationship came to an end due to the (in)abilities of my former employer. Long story short: I had to find a new source of income, which took my full attention. Absolutely no time after all to entertain and maintain this blog. On the private side there were some changes at the end of 2008, too. Taking this into consideration, there were surely other things way more important than publishing technical articles. I hope you didn't miss my ramblings too much...

2010

Despite a heavy workload and long hours of programming in order to deliver results and exceed my clients' expectations, I somehow managed to write once or twice a month. Honestly, I didn't have the drive to write, but I still managed to publish bits and pieces.

2011

BAM! Mental shutdown and focus on way more important things... My family expanded, and we had to look into other business models as the software development company had a major setback to deal with. There was absolutely no chance to settle down and write anything during this time. Instead, I started to be more active on Facebook for my wife's business. So much work, so little time...

2012

Slowly but surely getting back into the blogosphere. Things had changed on the business side as well as in my private life – finally more relaxed and more positive than the months before. Honestly, I would say that 2011/2012 was one of the darkest chapters in my life so far, and as things started to look a bit more rosy I found some more time to write again. Still more on the technical side, but other things as well. Life kicked back in and I had the pleasure of tinkering with this blog once more.

2013

I don't know when, but finally I made up my mind that I should focus on publishing at least one article per week on average. And by the end of 2013 I can proudly say that I was able to achieve this goal – even with a little bonus on top. This is surely also based on the initiation of the Mauritius Software Craftsmanship Community (MSCC) back in May, and therefore our regular meetups and get-togethers. There is always so much information to write about. Sometimes it is a real challenge to get my thoughts in proper order and to find the right words to put my experience of the day into one blog article. But surely I can say it is a great pleasure to report an increased number of articles.

2014

As the end of the year closes in, I have to admit that I am a few entries short of my average of one article per week. But let's see what can be done. I still have some unfinished content in the pipeline, and frankly I'm looking forward to completing it during the remaining three weeks of the year. Thanks to a stable business environment and the satisfaction of running the MSCC successfully, it's also easier to sit down several times a month to write something.

Plans for the future

Hahaha... Staying healthy, having good and prosperous business relationships, and enjoying family life.

Surely, I'm going to work on my ratio of one blog entry per week, but you never know... There are so many opportunities coming along, and it will be a challenge to keep up writing about them. But there will be content – rest assured of that. Not sure what exactly, but definitely something. And hopefully something interesting and entertaining for my readers.

Thank you, dear reader

I would also like to thank my readership in this article. There has been great feedback on a good number of articles, and it actually encourages me to keep on blogging. So, please feel free to drop me a note in the comment section below, and I'll look into it... Thanks, and hopefully 10 more years of excitement and interesting information. ;-)

by Jochen Kirstaetter (jochen@kirstaetter.name) at December 13, 2014 07:20 PM

Alex Feldstein

December 12, 2014

Articles

Good to know: Sender Policy Framework

Today, I ran into a "funny" situation where I got caught by my own mail server and DNS configuration. Actually, I'm referring to the Sender Policy Framework (SPF), and it prevented an email from being delivered on my behalf.

Delivery Status Notification

Earlier on, I wanted to share a document on OneDrive with my client, and was surprised that he didn't get an invitation by email within the usual 5 to 10 minutes. Well, it turned out that the email had been declined with a Delivery Status Notification (SMTP 550):

Reporting-MTA: dns;DUB004-OMC2S4.hotmail.com
Received-From-MTA: dns;DUB131-DS14
Arrival-Date: Fri, 12 Dec 2014 03:22:13 -0800

Final-Recipient: rfc822;client@example.com
Action: failed
Status: 5.7.1
Diagnostic-Code: smtp;550 5.7.1 <client@example.com>: Recipient address rejected: Please see http://www.openspf.net/Why?s=mfrom;id=....

That's good!

SPF is configured via DNS

Although SPF is used for mail transfers, it is configured in the DNS records of a domain. There you should specify an SPF record, or at least a TXT record, with content similar to this:

v=spf1 a mx a:kirstaetter.name ptr:smtp.kirstaetter.name mx:smtp.kirstaetter.name -all

The various mechanisms for describing an outbound mail server are explained in the Sender Policy Framework Record Syntax, and they are actually not too hard to learn and apply.
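Reading the record above mechanism by mechanism (my annotation, based on the general SPF syntax rather than anything specific to this domain):

v=spf1                        use SPF version 1
a                             hosts in the domain's own A record may send
mx                            the domain's MX hosts may send
a:kirstaetter.name            hosts in the A record of kirstaetter.name may send
ptr:smtp.kirstaetter.name     hosts whose validated reverse DNS ends in smtp.kirstaetter.name may send
mx:smtp.kirstaetter.name      the MX hosts of smtp.kirstaetter.name may send
-all                          hard-fail mail from any other host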

Rather be safe than sorry

In case you didn't configure SPF for your domain(s) yet: please go ahead and do yourself, and mainly other internauts, a favour and set up your DNS records accordingly. It doesn't take much time but improves your reputation as an outbound mail host.
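A quick way to verify what the world actually sees is a plain DNS query, for example with dig (nslookup works similarly on Windows):

dig +short TXT kirstaetter.name

If the record you configured shows up in the output, receiving mail servers will see it too.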

by Jochen Kirstaetter (jochen@kirstaetter.name) at December 12, 2014 05:03 PM

FoxCentral News

West Wind Internet and Client Tools 5.70 released

West Wind Technologies has released version 5.70 of West Wind Client Tools. The Client Tools provide a host of Internet functionality to FoxPro applications, including support for SMTP email, rich HTTP access, FTP uploads and downloads, as well as high level REST, SOAP and HTTP based data services. The toolkit also contains many utility classes, including a simple business object layer, a SQL Server data access wrapper, a .NET bridge helper to access .NET code from FoxPro, JSON and XML serializers, PDF rendering and more. Version 5.70 provides many enhancements to the JSON serializer and wwDotnetBridge functionality and a number of important bug fixes. Updates are free to registered users of version 5.0 and later, and a free shareware version is available for download to check out the product.

by West Wind Technologies at December 12, 2014 09:38 AM

Alex Feldstein

December 11, 2014

Alex Feldstein

Rick Strahl's Web Log

Mixing $http Promises and $q Promises for cached Data

If you use $http promises in your Angular services, you may find that from time to time you need to return some data conditionally, either based on an HTTP call or from data that is cached. In pseudo code the idea is something like this in an Angular service (this code won’t work):

function getAlbums(noCache) {
    // if albums exist just return
    if (!noCache && service.albums && service.albums.length > 0) 
        return service.albums;
    return $http.get("../api/albums/")
        .success(function (data) {
            service.albums = data;
        })
        .error(onPageError);
}

The idea is that if the data already exists, simply return the data, if it doesn’t go get it via an $http call and return the promise that calls you back when the data arrives.

Promise me

The problem with the above code is that you can’t return both straight data and a promise from the same method if you expect to handle the result data consistently in one place.

Most likely you’d want to write your controller method using code like this:

vm.getAlbums = function() {
    albumService.getAlbums() 
        .success(function(data) {
            vm.albums = data;
        })
        .error(function(err) {
            vm.errorMessage = 'albums not loaded';
        });            
}

The code is expecting a promise – or even more specifically an $http-specific promise, which is different from the standard $q promise that Angular uses. $http promises have .success() and .error() methods in addition to the typical .then() method of standard promises. I covered this topic in some detail a few weeks back in another blog post.
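To illustrate why the two flavors don’t mix, compare the callback shapes in a quick sketch (Angular 1.x behavior):

// $http promise: .success() receives the unwrapped response data
$http.get("../api/albums/")
    .success(function (data, status, headers, config) {
        // data is the JSON payload itself
    });

// standard promise: .then() receives the full response object
$http.get("../api/albums/")
    .then(function (response) {
        // the payload lives in response.data
    });

A consumer written against one shape breaks when handed the other, which is exactly the problem with returning a plain $q promise from the cached branch.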

So in order to return a consistent result we should return an $http compatible promise. But because of the special nature of $http promises the following code that creates a promise and resolves it also doesn’t quite work:

function getAlbums(noCache) {

    // if albums exist just return
    if (!noCache && service.albums && service.albums.length > 0) {
        var def = $q.defer();
        def.resolve(service.albums);        
        return def.promise;
    }
    
    return $http.get("../api/albums/")
        .success(function (data) {                    
            service.albums = data;                   
        })
        .error(onPageError);
}

While the code works in that it returns a promise, any client that tries to hook up .success() and .error() handlers will fail with this code. Even if the consumer decided to use .then() (which both $http and plain $q promises support), the values returned to the success and error handlers are different for $q and $http callbacks.

So to get this to work properly you really have to return an $http compatible promise.

Some Helpers to make it Easier

Because this is a scenario I run into frequently, I created a couple of helper functions to facilitate it: one that fixes up an existing deferred and one that creates a new completed promise directly.

(function(undefined) {
    ww = {};
    var self;
    ww.angular = {
        // extends deferred with $http compatible .success and .error functions
        $httpDeferredExtender: function(deferred) {
            deferred.promise.success = function(fn) {
                deferred.promise.then(fn, null);
                return deferred.promise;
            }
            deferred.promise.error = function(fn) {
                deferred.promise.then(null, fn);
                return deferred.promise;
            }
            return deferred;
        },
        // creates a resolved/rejected promise from a value
        $httpPromiseFromValue: function($q, val, reject) {
            var def = $q.defer();
            if (reject)
                def.reject(val);
            else
                def.resolve(val);
            self.$httpDeferredExtender(def);
            return def.promise;
        }
    };
    self = ww.angular;
})();

.$httpDeferredExtender() takes an existing, traditional deferred and extends its promise into an $http compatible one, so that it has .success() and .error() methods to assign to.

Using this extender you can now get the code that manually creates a $q deferred, to work like this:

function getAlbums(noCache) {
    // if albums exist just return
    if (!noCache && service.albums && service.albums.length > 0) {
        var def = $q.defer();
        def.resolve(service.albums);
        ww.angular.$httpDeferredExtender(def);
        return def.promise;
    }

    return $http.get("../api/albums/")
        .success(function (data) {                    
            service.albums = data;                   
        })
        .error(onPageError);
}

It works, but there’s a slight downside to this approach: when both the success and error handlers are hooked up, two separate promise mappings are attached. Both are called, because you can attach multiple handlers to a single promise, but there’s a little bit of extra overhead for the extra mapping.

Moar Simpler

Because the most common scenario for this is to actually return a resolved (or rejected) promise, an even easier .$httpPromiseFromValue() helper allows me to simply create the promise directly inside of the helper which reduces the entire code to a single line:

function getAlbums(noCache) {

    if (!noCache && service.albums && service.albums.length > 0) 
        return ww.angular.$httpPromiseFromValue($q, service.albums);
        
    return $http.get("../api/albums/")
        .success(function (data) {                    
            service.albums = data;                   
        })
        .error(onPageError);
}

This really makes it easy to return cached values consistently back to the client when the client code expects an $http based promise.


© Rick Strahl, West Wind Technologies, 2005-2014
Posted in Angular  JavaScript  

by Rick Strahl at December 11, 2014 12:35 AM

FoxCentral News

December at Chicago FUDG - Hank Fay on Lianja

Looking for a modern but familiar development environment to easily transition to? There has been a great deal of interest in the cross-platform Lianja environment - supporting Web, Windows, Mac and Linux. We're looking forward to seeing this presentation by Hank Fay. Hank is a very experienced VFP developer and tools developer who has decided that Lianja will be his new focus. We hope you can make it! You know the Who. Here's the When: Monday, 15 December 2014, 5:30pm And the Where: 1871 on the 12th floor at 222 West Merchandise Mart Plaza - IMSA Classroom - Chicago, IL 60654

by Chicago FoxPro Users and Developers Group at December 11, 2014 12:14 AM

December 10, 2014

CULLY Technologies, LLC

Alex Feldstein

December 09, 2014

Alex Feldstein

December 08, 2014

The Problem Solver

Testing an AngularJS directive with its template

 

Testing AngularJS directives usually isn’t very hard. Most of the time it is just a matter of instantiating the directive using the $compile() function and interacting with the scope or related controller to verify that the behavior is as expected. However, that leaves a bit of a gap, as most of the time the interaction between the directive’s template and its scope isn’t tested. With really simple templates you can include them in the template property, but using templateUrl and loading them on demand is much more common, especially with more complex templates. Now when it comes to unit testing, the HTTP request to load the template is not going to work, and as a result the interaction isn’t tested. Sure, it is possible to use the $httpBackend service to fake the response, but that still doesn’t use the actual template, so it doesn’t really test the interaction.

 

Testing the template

It turns out testing the template isn’t that hard after all; there are just a few pieces to the puzzle. First of all, Karma can serve up other files besides the normal JavaScript files just fine, so we can tell it to serve our templates as well. With the pattern option for files we can tell Karma to watch and serve the templates without including them in the default HTML page loaded. See the files section from the karma.conf.js file below.

files: [
    'app/bower_components/angular/angular.js',
    'app/bower_components/angular-mocks/angular-mocks.js',
    'app/components/**/*.js',
    'app/*.js',
    'tests/*.js',
    {
        pattern: 'app/*.html',
        watched: true,
        included: false,
        served: true
    }
],

 

With that, the files are available on the server. There are two problems here though. First of all, when running unit tests the mock $httpBackend is used, and that never does an actual HTTP request. Secondly, the file is hosted at a slightly different URL: Karma includes ‘/base’ as the root of our files. So just letting AngularJS load it is out of the question. However, if we use a plain XMLHttpRequest object, the mock $httpBackend is completely bypassed and we can load what we want. Using the plain XMLHttpRequest object has a second benefit in that we can do a synchronous request instead of the normal asynchronous request and use the response to pre-populate the $templateCache before the unit test runs. Synchronous HTTP requests are not advisable for code on the Internet and should be avoided in any production code, but in a unit test like this they work perfectly fine.

So taking an AngularJS directive like this:

angular.module('myApp', [])
    .directive('myDirective', function () {
        return {
            scope: {
                clickMe: '&'
            },
            templateUrl: '/app/myDirective.html'
        };
    });

 

And a template like this:

<button ng-click="clickMe()">Click me</button>

 

Can be easily tested like this:

describe('The myDirective', function () {
    var element, scope;

    beforeEach(module('myApp'));

    beforeEach(inject(function ($templateCache) {
        var templateUrl = '/app/myDirective.html';
        var asynchronous = false;
        var req = new XMLHttpRequest();
        req.onload = function () {
            $templateCache.put(templateUrl, this.responseText);
        };
        req.open('get', '/base' + templateUrl, asynchronous);
        req.send();
    }));

    beforeEach(inject(function ($compile, $rootScope) {
        scope = $rootScope.$new();
        scope.doIt = angular.noop;

        var html = '<div my-directive="" click-me="doIt()"></div>';
        element = $compile(html)(scope);
        scope.$apply();
    }));

    it('template should react to clicking', function () {
        spyOn(scope, 'doIt');

        element.find('button')[0].click();

        expect(scope.doIt).toHaveBeenCalled();
    });
});

 


Now making any breaking change to the template, like removing the ng-click, will immediately cause the unit test to fail in Karma.

 

Enjoy!

by Maurice de Beijer at December 08, 2014 07:33 PM

Alex Feldstein

December 07, 2014

Alex Feldstein

December 06, 2014

Alex Feldstein

December 05, 2014

Alex Feldstein

December 04, 2014

Alex Feldstein

December 03, 2014

FoxCentral News

Philadelphia VFP User Group: December 9: Menachem Bazian on jQuery

The next meeting of the Philadelphia VFP User Group is Tuesday, December 9 at 7 PM in room 104, DeVry University, 1140 Virginia Drive, Fort Washington, PA. As usual, feel free to bring some dinner and come as early as 6:30 PM. Menachem Bazian returns this month to teach how to "Jazz your Web Applications with jQuery!" Abstract: This session is an introduction to the most powerful and widely used client side function library in the web development world. jQuery is both a function library and a foundation that makes web development easier and more powerful. It is also the basis for hundreds of really handy plugins that can make you look like a master web developer with little effort. This is an introductory session to jQuery and will introduce you to how it works and show you some of the great plugins I use daily in my web application development.

by Philadelphia Visual FoxPro User Group at December 03, 2014 09:12 PM

FoxProWiki

UpcomingEvents

A place to list upcoming Visual FoxPro events like conferences, meetings, user groups, open training sessions...
Closest at the top please, and please remove past events.

December 03, 2014 09:07 PM

VFP Philly

Tuesday, December 9: Menachem Bazian: "Jazz your Web Applications with jQuery!"



Our next meeting is next Tuesday, December 9. As usual, feel free to bring some dinner and come as early as 6:30 PM.

Menachem Bazian returns this month to teach how to “Jazz your Web Applications with jQuery!”

Abstract: This session is an introduction to the most powerful and widely used client side function library in the web development world. jQuery is both a function library and a foundation that makes web development easier and more powerful. It is also the basis for hundreds of really handy plugins that can make you look like a master web developer with little effort.

This is an introductory session to jQuery and will introduce you to how it works and show you some of the great plugins I use daily in my web application development.


by Tamar E. Granor (noreply@blogger.com) at December 03, 2014 09:01 PM

Alex Feldstein

December 02, 2014

CULLY Technologies, LLC

Open-sourced MicroSoft .NET vs Xojo

A client recently contacted me to ask my opinion about this piece of news: Microsoft to open source more of .NET, and bring it to Linux, Mac OS X

Was I surprised? Not really. Once Ballmer was gone, Microsoft was hinting there would be big changes. This is pretty big news.

However, I don’t really see any changes in my professional plans. Even if the development of an application is done with the Windows version of .NET, the deployment of a Mac or Linux version of the application will basically be brand new and subject to performance issues (speed or stability).

The Mono Project, an open-sourced version of .NET, has been passed around from Ximian, to Novell, to Attachmate, and now Xamarin. I love open-sourced projects, but this one has a bit of an orphan problem surrounding it. Now that .NET is open-sourced, the Mono team will be able to look directly at a lot of the code instead of just inferring operation from behavior and documentation. This may give them a leg up.

I’m still confused about whether Microsoft will create its own cross-platform .NET team or join forces with the Xamarin development/support team.

I don’t know what will happen with .NET and its cross-platform capabilities but I think I’ll keep on my current course and stick with Xojo for the foreseeable future. Xojo is here today and I really like developing in it. I haven’t found anything better that develops desktop applications on Linux, Mac and Windows plus web applications as well.

by kcully at December 02, 2014 09:44 PM

My Black Friday mistakes

My first mistake was answering the phone.

Let me back up. I was railing, RAILING, against the stores that were [1] open on Thanksgiving and/or [2] opening even earlier on Thanksgiving than in previous years. I was trying to shame stores into closing on Thanksgiving next year. I promised to spend less at stores that were open on Thanksgiving.

Do I have a problem with stores making a profit? No. I love it when stores make money by providing goods and services at a reasonable price with good customer service.

Should it be against the law for individuals to shop at stores on Thanksgiving? A big No there. People may think differently than I do, and I shouldn’t take away their freedom over something as simple as this. We can disagree and I’m okay with that.

Should it be a law that stores are closed on Thanksgiving? No. That’s ridiculous. Stores should do as they see fit to provide the goods and services to their customers. The flip side is that I should be free to speak that I will not be spending as much money at these stores.

I want to use my fiscal power and my social media contacts to help encourage stores to close on this holiday and to allow their workers to be home with their families. I wasn’t able to spend time with my nephew on Thanksgiving because he was working in a big box electronics store that day. In my opinion, people can spend extra money on Friday, Saturday, and every day up to Christmas as they see fit. I felt bad for the people compelled to work.

Here’s where I went wrong:

1. I answered the phone

My brother was calling me. Normally he only calls when he needs technical support. I guess this might qualify.

2. He wanted me to go out on Black Friday…

I was doing some home improvement around the house. Happy as a clam. I don’t think I’ve ever been out shopping on Black Friday but my brother was begging me to leave the comforts of my home for …

3. He wanted me to go to Target …

Yup, a Target store. Recently highlighted in the news for their hacked credit card information and registers. Target, which was not only open on Thanksgiving, but opening its stores even earlier this year than in previous years.

4. To buy an Apple iPad Air …

I really, really don’t like Apple as a company. They have proprietary hardware and software that appear to lock their customers into their services. Sigh. For more information refer to this article on 5 Reasons You Should Be Scared of Apple or 9 Signs You Should Be Scared of Apple or Forget Google – it’s Apple that is turning into the evil empire or many more. Plenty of reasons to avoid Apple products.

My brother begged me to get the iPad Air (not the iPad Air 2) for a relative (not me, BTW). It was on sale for $399 with a $100 Target gift card. Was that a good deal? I have no clue. It’s what he wanted me to get.

So I guess I’m a sucker. I went out to Target. {Shudder} It was like being in an episode of “The Walking Dead”. I survived it but I needed a drink when I got home. And I don’t normally drink. Crowded. Not very much help. Crowded. I survived.

Moral of the story: I shouldn’t answer my phone. And that’s what I did wrong on Black Friday.

by kcully at December 02, 2014 08:42 PM

The Problem Solver

Rick Strahl's Web Log

Creating multi-target NuGet Packages with vNext

One feature that is going to make it much easier to create multi-targeted NuGet packages is the ability of the vNext platform to package compiled code directly into NuGet packages. By default vNext applications don’t compile to disk but rather keep their source code in the AppCode folder. A running application then reads this source code and compiles it on the fly via Roslyn to execute it from memory.

However, if you build class libraries you can also optionally write out the result to disk, which creates a NuGet package. That’s cool all by itself, but what’s even nicer is the fact that you can create multiple build targets for different versions inside of that NuGet package. You can create output for vNext Full and Core and even standard .NET and PCL components – all from a single project!

It’s essentially very easy and natural to produce a NuGet package like this:

NuGetPackage

This package contains output for vNext Full CLR (aspnet50), vNext Core CLR (aspnetcore50) and the full .NET Runtime (net45).

If you’ve built multi-targeted assemblies and packages before, you probably know how much of a pain this was in previous versions of .NET and Visual Studio. You either had to hack the MSBUILD process or else use separate projects, solution build targets or separate solutions altogether to accomplish this. In vNext you can do this with a few simple project settings. You can simply build your project with output options turned on, either from within Visual Studio or from the command line without using MsBuild (yay!), and produce a NuGet package as shown above.

That’s pretty awesome!

Creating a Project

As part of my exploration of vNext I’m in the process of moving a few of my helper libraries to vNext. This is turning out to be a challenge if you plan on supporting the Core CLR, which has a fairly restricted feature set. The vast percentage of code works as is, but there’s also a fair bit – some of it surprising – that doesn’t. And there’s an awful lot of looking for packages and namespaces to get at the features that I know are there…

For initial testing I used my Westwind.Utilities library and just pulled out one of the classes – the StringUtils class. I used this one because it has very few system dependencies, so I hoped it would just run as is even under vNext. Turns out it doesn’t – even this very basic class has a few pieces that don’t exist under vNext, or at least not with the same signatures. Which makes it perfect for this example, as I have a few methods I need to bracket out for Core CLR usage.

Setting up a Library Project

In order to set this up the first thing I did is create a new Class library project in Visual Studio.

NewClassLibraryProject

By default Visual Studio creates a project.json file with the two ASP.NET vNext targets (aspnet50 and aspnetcore50). In addition I explicitly added the .NET 4.5 target (net45) in project.json (which is actually what’s shown in the project above).

Here’s what project.json looks like:

{ "version": "1.0.0-*", "dependencies": { }, "frameworks": { "net45": { "dependencies": { } }, "aspnet50": { "dependencies": { } }, "aspnetcore50": { "dependencies": { "System.Runtime": "4.0.20-beta-*", "System.IO": "4.0.10-beta-*", "System.Runtime.Extensions": "4.0.10-beta-*", "System.Text.Encoding": "4.0.0-beta-*", "System.Text.RegularExpressions": "4.0.0-beta-*", "System.Linq": "4.0.0-beta-*", "System.Reflection": "4.0.0-beta-*", "System.Reflection.Extensions": "4.0.0-beta-*", "System.Reflection.TypeExtensions": "4.0.0-beta-*", "System.Threading.Thread": "4.0.0-beta-*", "System.Threading.Tasks": "4.0.0-beta-*", "System.Globalization": "4.0.0-beta-*", "System.Resources.ResourceManager": "4.0.0-beta-*" } } } }

The three highlighted targets correspond to the References nodes in Visual Studio project and correspond to the 3 different build targets of the project.

Note that I also have to explicitly reference any of the BCL components I’m using in my component for the Core CLR target. The other two targets get these same components from GAC components of the full CLR, so they don’t need these. Since I’m including a ‘classic’ .NET 4.5 target here, I have to be careful of how I add references – all vNext references that apply to both the vNext Core and Full CLR need to be explicitly assigned to their dependency nodes, while any dependencies of the full .NET runtime need to go in its dependency section.

If you target only the two vNext versions you can use the global dependency node for any shared components which is a lot less verbose.

Note that for the Core CLR I have to manually add all the little tiny packages for the BCL classes that used to live in mscorlib and system. I added these as I started tweaking my component – they aren’t there by default. The only default component is System.Runtime. While adding every little thing is a pain, it does help with modularization, where you get just what you ask for and nothing more. But to be honest, I find it hard to believe that anything less than what I have above would ever not be used by either my own code or any referenced components (minus the regex maybe), so maybe this is just getting a little too granular.

If you’re building projects that use more high level components (like EntityFramework or the new ASP.NET MVC) you’ll find that most of the things you need to reference are already referenced by those higher level components, so some of this minute package referencing goes away. But if you’re writing a core component that has minimal non-system dependencies, you’ll find yourself doing the NuGet Package Hula!

To help with finding packages and namespaces you might find http://packagesearch.azurewebsites.net useful. Maintained by a Microsoft employee (Glenn @condrong), this tool lets you search for packages and namespaces in the vNext BCL/FCL libraries by name:

packageSearch 

Conditional Code

Once you have your targets defined you can start adding some code. If your code just works across all the targets defined you’re done. Writing greenfield code it’s not too difficult to write code that works across all platforms.

In my case however, I was backporting an existing component and I ran into a few code references that didn’t work in the Core CLR.

If you have multiple targets defined in your application, vNext will compile your code for all 3 targets and show you errors for any of the targets that fail. In my case I ran into problems with various System.IO classes like StreamWriter and MemoryStream that don’t exist (yet?) in vNext. In Visual Studio the compilation error window shows the errors along with the target that failed:

CompileErrors

Note that the first 3 errors are StreamWriter related. Apparently StreamWriter doesn’t exist in vNext, or I’m missing a package reference. I can see that the problem is in aspnetcore50 based on the project name in the Project column.

I can now also look at that code in the Visual Studio editor and see the StreamWriter reference error there for Core CLR along with an overview of the code I’m calling and which targets are supported and which ones won’t work (nice):

NoStreamWriter

It’s a bit odd that StreamWriter is not working. In fact, most of the stream related classes in System.IO don’t appear to be there. It makes me think that either I’m missing a package or this is still under heavy construction by Microsoft. Either way, it demonstrates the point that there may be things that don’t work with the Core CLR.

To get around this I can now choose to bracket that code, effectively removing this function (or alternatively rewrite the function using some other code). For now I’m just going to bracket out the offending method altogether like this (with a //TODO to come back to it later):

#if !ASPNETCORE50
        /// <summary>
        /// Simple Logging method that allows quickly writing a string to a file
        /// </summary>
        /// <param name="output"></param>
        /// <param name="filename"></param>
        public static void LogString(string output, string filename)
        {            
            StreamWriter Writer = File.AppendText(filename);
            Writer.WriteLine(DateTime.Now.ToString() + " - " + output);
            Writer.Close();
        }
#endif

If I then recompile or pack the project, I’ll get no errors.

The compiler constants available for the three target versions in this project are ASPNET50, ASPNETCORE50 and NET45. Each of these #define constants is implicitly created as the upper case version of a framework defined in project.json. You can use any of these to take particular action or bracket code for compilation.
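Conversely, you can compile something only for the full .NET target. A minimal sketch using the NET45 constant named above (the helper method itself is just an illustration):

#if NET45
        /// <summary>
        /// Full framework only helper - not compiled for the vNext targets
        /// </summary>
        public static string GetTempFolder()
        {
            return System.IO.Path.GetTempPath();
        }
#endif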

Using the Component in another Project (Source Code)

If I flip over to my Web project and want to now use my component I can simply add a NuGet reference to it like this:

"dependencies": {
    "Microsoft.AspNet.Hosting": "1.0.0-beta2-*",
    "Microsoft.AspNet.Server.WebListener": "1.0.0-beta2-*",
    "Microsoft.AspNet.Server.IIS": "1.0.0-beta2-*",
    "EntityFramework": "7.0.0-beta2-*",
    "EntityFramework.SqlServer": "7.0.0-beta2-*",
    "Microsoft.Framework.ConfigurationModel.Json": "1.0.0-beta2-*",
    "Microsoft.AspNet.Mvc": "6.0.0-beta2-*",
    "AlbumViewerBusiness": "" 
"Westwind.Utilities": "", },

Once added, I can use it in code just like any other package. When the Web project uses this ‘project reference’ it pulls in the source code for Westwind.Utilities, compiles it on the fly and executes it.
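Consuming the class then looks like any other library usage. A sketch, assuming the classes live in a Westwind.Utilities namespace matching the package name:

using Westwind.Utilities;

// pull the title out of an HTML document with the ported StringUtils class
string title = StringUtils.ExtractString(html, "<title>", "</title>");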

I also get the same runtime version information from IntelliSense that tells me whether a feature is supported for the versions I’m targeting in the Web project. My Web project targets vNext Full and Core CLR, so if I try to use the StringUtils.LogString() method I get this:

LibraryNoSupportedMethod

You can see here that LogString is available for Full CLR operation but not for Core CLR operation, and IntelliSense lets you know. The compiler, too, will let you know: if you use LogString while targeting the Core CLR you will get an error.

As you can imagine, bracketing code out is not always a good idea – it makes it much harder to reuse existing code or migrate code. But it’s quite common, as you can see by the heavy refactoring that’s happening in the core BCL/FCL libraries that Microsoft is reworking and the many missing features that just aren’t there (yet?).

Building a NuGet Package

When I built my project above I simply used the default build operation, which doesn’t actually generate any output. By default vNext runs the code directly from source code and compiles it into memory. In vNext the compiler acts more as a syntax checker than an actual compiler when you click the Build button in Visual Studio.

You can however force the compiler to generate output to disk by setting an option which creates – you guessed it – a NuGet package rather than just an assembly. If I go back to the Westwind.Utilities project and open the Project Properties, I find this option (a page that is very likely to grow a lot more package-creation settings over time):

[Screenshot: the build output option in the Project Properties dialog]

Now if I build the project I get my NuGet package built:

[Screenshot: the generated NuGet package in the build output folder]

I can now take that package and either publish it or share it as needed. Before publishing I could also go in and customize the nupkg using the NuGet Package Explorer:

[Screenshot: the package opened in NuGet Package Explorer]

Note that the current Package Explorer doesn’t understand the new vNext runtime versions yet, but that’ll change, and hopefully Microsoft will consider moving some of this functionality right into Visual Studio and the build dialog so that package metadata can be edited and adjusted there.

Packages made easy

Creating multi-targeted libraries is never easy, but these new vNext features at least make it a lot easier to build them from a single source code base without heavily tweaking the build process – it just works out of the box.

© Rick Strahl, West Wind Technologies, 2005-2014
Posted in ASP.NET vNext  

by Rick Strahl at December 02, 2014 09:10 AM

Alex Feldstein

Rahul Desai's Blog

Compatibility with Microsoft Dynamics CRM 2015

This article focuses on recent and upcoming compatibility testing for Microsoft Dynamics CRM 2015, as well as the most common compatibility questions handled by Microsoft Dynamics Technical Support. It does not indicate Microsoft Dynamics CRM compatibility for all products, and it supplements the information in the Microsoft Dynamics CRM Implementation Guide… more @ the link below:

Compatibility with Microsoft Dynamics CRM 2015

by Rahul Desai at December 02, 2014 01:44 AM

December 01, 2014

FoxProWiki

UpcomingEvents

Editor comments: Added December 2014 LA Fox meeting details
A place to list upcoming Visual FoxPro events like conferences, meetings, user groups, open training sessions...
Closest at the top please, and please remove past events.

December 01, 2014 05:40 PM

Alex Feldstein

November 30, 2014

Alex Feldstein

Wanderers: a stunning video of space

 

Watch it full screen. Every background of every place shown is real.

How we miss Carl!

(Via Phil Plait/Slate)

by Alex Feldstein (noreply@blogger.com) at November 30, 2014 11:33 AM

November 29, 2014

Alex Feldstein

Rick Strahl's Web Log

Updating Assembly Redirects with NuGet

Here’s a little NuGet gem that I just found out about today: you can get NuGet to explicitly rewrite the assembly redirects in your .config files based on the NuGet packages installed in the project.

You can use the following command from the Package Manager console:

PM> Get-Project -All | Add-BindingRedirect

This recreates all the assembly redirects defined in the web.config or app.config files across a solution and updates them to match the versions of the packages actually installed in each project. In other words, it refreshes the assembly redirects to reflect what each project really references.
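
For context, these redirects are the <bindingRedirect> entries in the config file's runtime section. A typical entry looks something like this (the assembly name and version numbers here are purely illustrative):

<runtime>
  <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
    <dependentAssembly>
      <assemblyIdentity name="Newtonsoft.Json"
                        publicKeyToken="30ad4fe6b2a6aeed" culture="neutral" />
      <bindingRedirect oldVersion="0.0.0.0-6.0.0.0" newVersion="6.0.0.0" />
    </dependentAssembly>
  </assemblyBinding>
</runtime>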

Right on! This is something I run into quite frequently and this simple command fixes the problem easily. If you want this to work on an individual project, just remove the -All flag.

Thanks to @maartenballiauw and @tsimbalar who pointed out this command to me when I was griping on Twitter about mismatched assemblies after an update to the latest ASP.NET WebAPI and MVC packages in a couple of projects. If you get “Could not load file or assembly '<assembly>' or one of its dependencies.” errors when you know you have the package and assembly referenced in your project, you’re likely running into a problem related to assembly versioning and need assembly redirects. NuGet automatically creates redirects for you, but versions can get out of sync when projects with cyclical dependencies are upgraded in a single solution.

Maarten also wrote up a blog post on this. I don’t want to take away from Maarten’s post here, so I’ll just link you to it for a good deal more information:

Could not load file or assembly… NuGet Assembly Redirects

I thought it was important enough to repost here, since it’s a little-known command that can probably benefit many people. I know it’s definitely a problem I run into a lot because I have a few component libraries that take dependencies on high-level framework libraries that rev frequently, so it’s easy for things to get out of sync. This will help me tremendously in making sure that libraries are properly redirected.

© Rick Strahl, West Wind Technologies, 2005-2014
Posted in .NET  NuGet  ASP.NET  

by Rick Strahl at November 29, 2014 09:19 AM

November 28, 2014

Alex Feldstein

November 27, 2014

Alex Feldstein

November 26, 2014

Alex Feldstein

Bohemian Rhapsody in Blue (mashup)

Scott Bradlee of Postmodern Jukebox doing a wonderful mashup of Gershwin and Queen.


by Alex Feldstein (noreply@blogger.com) at November 26, 2014 10:39 AM

Rick Strahl's Web Log

WebClient and GetWebResponse not firing on Async Requests

Ran into an unexpected behavior when implementing an async version of WebClient to download data. While you can override WebClient to capture the HttpWebResponse object and pick up additional HTTP information that the WebClient class doesn’t expose, I initially had problems capturing the response on async requests. In this post I discuss a use case, the problem and the simple solution.
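
The full post isn’t reproduced in this excerpt, but the pattern it refers to is overriding WebClient’s GetWebResponse() method, and crucially both overloads, since async requests are routed through the IAsyncResult version. A minimal sketch (the WebClientEx name is made up for illustration):

using System;
using System.Net;

public class WebClientEx : WebClient
{
    // Captured response so callers can inspect status code, headers etc.
    public HttpWebResponse Response { get; private set; }

    protected override WebResponse GetWebResponse(WebRequest request)
    {
        var response = base.GetWebResponse(request);
        Response = response as HttpWebResponse;
        return response;
    }

    // Async requests come through this overload instead
    protected override WebResponse GetWebResponse(WebRequest request, IAsyncResult result)
    {
        var response = base.GetWebResponse(request, result);
        Response = response as HttpWebResponse;
        return response;
    }
}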

by Rick Strahl at November 26, 2014 01:15 AM

Calvin Hsia's WebLog

Create managed Tests for native code

In the old days of code development, the developer would do several steps repeatedly: 1. edit the code 2. Save 3. Compile 4. Link 5. Deploy (if necessary) 6. Start (or switch to) the debugger 7. Start the app under the debugger. 8. Examine the code behavior...(read more)

by CalvinH at November 26, 2014 12:15 AM

November 25, 2014

FoxProWiki

WordAutomationPrintMerge

I have revisited this. In doing so I have found a MUCH cleaner way to automate Word from VFP and attach a data source.

The main issue is that if you don't get it exactly right, the user is presented with a system dialog box.

This is pretty simple: pass in the full path to the document file and the Excel file, plus the name of the worksheet from the workbook that you want to use as merge data. I only got it to work when the worksheet name matched the workbook name, but I didn't fiddle around with it too hard.

I didn't get this to work with a named range or with R1C1-style references. It also doesn't seem to work with docx files. :(

NOTE: What you MUST get right are the weird backward single quotes (backticks) around the sheet name AND the $ at the end of the sheet name. I didn't know how to type the quotes so I ended up cutting and pasting them into the code ... and was shocked when it actually WORKED!! ?cs

function openDataSource( tcDocFile, tcXLSFile, tcSheet )  && tcSheet could also be a named range

  *** e.g. openDataSource( "c:\merge\letter.doc", "c:\merge\data.xls", "Sheet1" )  (paths hypothetical)
  local loWord, loMerge, lcSheet

  *** Open Word and the merge document (note: Documents collection, not Document)
  loWord = createobject( 'word.application' )
  loWord.Documents.Open( m.tcDocFile )

  loMerge  = m.loWord.ActiveDocument.MailMerge

  *** The backtick quotes around the sheet name and the trailing $ are required
  lcSheet = "SELECT * FROM `" + m.tcSheet + "$`"

  *** Attach the Excel sheet as the mail merge data source
  loMerge.openDataSource(m.tcXLSFile,,,,,,,,,,,'Entire Spreadsheet',m.lcSheet)

endfunc

November 25, 2014 07:50 PM

CULLY Technologies, LLC

My new music player on Linux is console based

I was listening to the Linux Action Show and they highlighted the application “Music on Console” a.k.a. MOC. I took it for a spin and was hooked.

[Screenshot: MOC playing Audioslave – Bring 'em Back Alive]

MOC was in the Ubuntu repos, so adding it was easy via the Software Manager. You can also install it with sudo apt-get install moc (note that the package is named moc, while the command you run is mocp). To start it, open a terminal and enter mocp. There are ways to customize MOC and make it look much nicer, including transparent backgrounds, themes and more. I’ll let you search for that on your own.

Here are a few shortcuts that are helpful:
(Remember that CaPiTaLiZaTiOn matters!)


? or h ==> get help
A ==> add music files recursively to the playlist
[Space] ==> play/pause currently selected song
s ==> stop playing
S ==> shuffle playlist
[Tab] ==> switch between panels
, ==> increase volume by 5%
. ==> decrease volume by 5%
Q ==> quit mocp

When I run htop, it barely shows 3% CPU utilization. Most of the time it is less than 2% CPU.

One feature I’d like is a way to shuffle the playlist order itself. You can play the playlist in shuffle mode, but then there’s no way of seeing what the next random song will be. If I could shuffle the playlist once, I could play it in straight order and still see what’s coming up. It’d be a ‘nice to have’ feature.

I liked Clementine and Banshee, but I didn’t need all of the album art and all of the confusion and overhead. MOC is really nice: simple, lightweight and fast. That’s all I need.

More info:
How to install MOC
More MOC shortcuts
A MOC demo video

by kcully at November 25, 2014 03:06 PM