Happy New Year, New Blog for 2013

happy new year

The past year (2012) has been a year full of new things for me.

Now I'm ready for the new ones up ahead in 2013, and hopefully I'll be able to share more about the small things and lessons that come my way.

Happy New Year everyone and may your 2013 be full of great things!

Job Opportunity – UX Developer for Next Generation Marketing Analytics Cloud Application

We are looking for ambitious User Experience engineers who want to build a new, highly graphical, easy-to-use, multi-device cloud data management and processing application for business intelligence and predictive analytics.

Role: To be a key engineer implementing a new, highly interactive, data-intensive cloud service for our business intelligence and predictive analytics application. This service will be used by a variety of users, many of them marketers, which will put high demands on graphics, interaction and design. It will need to be built so that we can support multiple devices. The person in this role will have a strong influence on technology strategy and implementation, and will work closely with skilled graphic designers.

Required skills:
* Strong HTML, CSS, JavaScript skills.
* JSON experience.
* Experience with interactive (AJAX) user interface implementation.
* Experience integrating with RESTful web services from JavaScript.
* Architectural and design experience in building software that is maintainable and reusable.

Desired skills (skills that would be an additional benefit):
* Experience with HTML5.
* Experience with iOS and Android development.
* Experience with highly graphical user interfaces.
* Experience with data visualization.
* Experience with user interface code performance measurement and optimization.

* We are looking for people with different levels of experience; anyone from fresh out of college to an experienced developer may be a fit.

Job Opportunity – .Net UX Developer for Marketing Analytics Cloud Application

High-impact project for an ambitious .Net UI developer

Profile :

Self-motivated with a passion for Web development and UX implementation

Role : Update the Agilone user interface. The design has already been completed.

Required skills:
* Ability to interpret and fulfill the designer's vision for sophisticated UI and high-performing applications.
* Creative individual with a strong sense of style, detail-oriented down to pixel-perfect implementation.
* Self-motivated and able to work effectively with minimal instruction and supervision.
* Strong HTML, CSS skills.
* Strong client-side scripting skills (JavaScript, AJAX).
* Experience developing UI on ASP.Net (C#).
* Experience with cross-browser UI development.

Desired skills (skills that would be an additional benefit):
* 3-5 years of hands on experience with UI development.
* XML, WebServices
* Experience using version control systems (Subversion).

* We are looking for a contractor or an employee. If an employee, the person also needs to fit the following profile: UX Developer

remove/delete svn folders

Since I started using SVN I stumbled upon this technique, and I've been using it ever since.

This is one of the many ways (but the easiest) to remove SVN folders (.svn) recursively from a given folder, effectively “unbinding” a folder from SVN.

Follow these easy steps: [more]

1) Download this registry file: DeleteSVNFolders.reg (313.00 bytes) – see the notes below for more information (security issues/verification and code source). I urge you to review them, but if you don't want to worry about the details and trust me enough, then please go ahead and download.

2) Double-click the file you downloaded.

    You will be prompted to confirm that you really want to perform the action (the prompt differs depending on your OS). Just continue, and if all goes well you will get a success message/dialog.

3) To see it work, go to the folder you want to remove the .svn folders from, then right-click it.

You should see the “Delete SVN folders” menu item.

4) Click the said menu item and the .svn folders will be removed (a command window will also show up displaying progress – though if the folder structure is not that deep, it may disappear very quickly).


And you're done!

** Notes

1) Altering your registry is not for the faint-hearted, and you'd have to trust the author/publisher. In this case I do, and I have been using it for years.

    And if you can, also look into it (open it in e.g. Notepad or any text editor) and understand what it does, to make sure you're not letting it do something you don't want.

    I have taken this from Jon Galloway's Shell Command – Remove SVN Folder and simply made the reg file downloadable (for those who are not so confident dealing with reg files).
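For the curious, the .reg file is essentially Jon Galloway's shell command wrapped up for import. A sketch of roughly what such a file contains (the actual key names and command below are my reconstruction – verify against the copy you actually download before importing it):

```reg
Windows Registry Editor Version 5.00

; Adds a "Delete SVN Folders" item to the right-click menu of folders
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Folder\shell\DeleteSVN]
@="Delete SVN Folders"

; Runs a cmd.exe loop that recursively removes every .svn folder
[HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Folder\shell\DeleteSVN\command]
@="cmd.exe /c \"TITLE Removing SVN Folders in %1 && FOR /r \"%1\" %%f IN (.svn) DO RD /s /q \"%%f\" \""
```

The `FOR /r ... RD /s /q` loop is what does the actual recursive delete; the registry entries just expose it as a context-menu command.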

more sql performance tips

As most of you know the general rule in performance tuning is …..

“measure, measure, measure” and “measure again”

But it doesn't hurt to give out some warning signs from time to time. Here are two: [more]

1) be careful in doing anything fancy on JOIN predicates (that is the condition in the “ON” section)

   most of the time, anything more complex than comparing one column from each of the tables being joined will have some performance impact.

   again, measure, but in my experience even creating a common table expression or a subquery to compute the derived column, and then joining on it, is faster than doing the computation in the JOIN predicate.

   also, the fewer columns involved in a JOIN, the better.
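   As a sketch of the first point (table and column names here are made up for illustration):

```sql
-- slower: expression evaluated inside the JOIN predicate, row by row
SELECT o.OrderId, c.CustomerName
FROM Orders o
JOIN Customers c
  ON LTRIM(RTRIM(o.CustomerCode)) = c.CustomerCode;

-- often faster: derive the column first in a CTE, then join on plain columns
WITH CleanOrders AS
(
    SELECT OrderId, CleanCode = LTRIM(RTRIM(CustomerCode))
    FROM Orders
)
SELECT o.OrderId, c.CustomerName
FROM CleanOrders o
JOIN Customers c
  ON o.CleanCode = c.CustomerCode;
```

   As always, measure both against your own data before committing to either form.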

2) always prefer set operations over row-by-row operations

    for the same reason that cursors should be avoided

    what many don't know, however, is that if a UDF (user-defined function) is used in a SELECT query's result set, that function will be called for every row, and thus there will be an overhead. Again, measure, but for performance-sensitive cases it will be significant.

     so if you have to squeeze out any performance gain you can, then get rid of the function and, although redundant, put the logic in the query itself.

     maintainability-wise this is not advisable, so you have to weigh the trade-off here.

     For example:

     SELECT fullname = dbo.GetFullName(FirstName, LastName) FROM Employee (NOLOCK)

     regardless of the complexity of the GetFullName function's implementation, this would be slower than having it handled in the query directly, say:

     SELECT fullname = FirstName + ' ' + LastName FROM Employee (NOLOCK)
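     And on the measuring itself: a simple way on SQL Server to compare the two versions above is to turn on timing statistics and run both:

```sql
SET STATISTICS TIME ON;

SELECT fullname = dbo.GetFullName(FirstName, LastName) FROM Employee (NOLOCK);
SELECT fullname = FirstName + ' ' + LastName FROM Employee (NOLOCK);

SET STATISTICS TIME OFF;
-- compare the "CPU time" and "elapsed time" figures reported
-- for each statement in the Messages tab
```

     On a table of any size, the difference between the UDF and the inline expression shows up clearly in the reported CPU time.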

Hope to share more soon. But again, don’t forget, measure3x

adding custom SSIS transformation to visual studio toolbox fails

Just recently I encountered an issue deploying a custom SSIS component assembly, where I could not add a custom SSIS transformation to the Visual Studio toolbox.

It turned out to be a relative “no-brainer” of an error – if only the clues had been more straightforward. Basically, after deploying the assembly I could not find my component listed in the “SSIS Data Flow Items” tab. [more]

There are a number of articles available online on creating custom SSIS objects (control flow tasks, source, destination, data transformation etc).

Here are a few :

Extending SSIS Part 1 – Creating a Custom Data Flow Transformation Component

Developing a Custom Data Flow Component

Developing a Custom Transformation Component with Synchronous Outputs

Searching with a keyword like “adding custom SSIS transformation” should return more.


At Agilone, we've been developing SSIS components to complement our products. One of the issues I faced recently: after deploying the custom SSIS components (the assembly), when it was time to add them to the toolbox (Choose Items > SSIS Data Flow Items tab, since the component was a data flow item), I was unable to see the component (or any component in the assembly I had just added to the GAC and the PipelineComponents folder).

I tried these (in different orders and to no avail, but I'm including them for completeness):

1) close BIDS (Visual Studio Shell), or reset the toolbox and close BIDS

2) uninstall the assembly from the GAC and reinstall it

3) make sure the assembly is in the correct folder (e.g. DTS\PipelineComponents – here are more details on deployment and testing – Deploying and Testing Custom SSIS Components)

4) restart SSIS service

5) reopen the project and try to re-add the component to the toolbox.

As mentioned, none of these did the trick, and it took me some time to figure out.

It turns out that there was a problem with the assembly. I had built the version referencing some SQL 2005 libraries (DLLs) with Specific Version = true, and when I deployed it to that server (which only had the SQL 2008 DLLs) it failed silently and simply didn't appear in the “SSIS Data Flow Items” tab.

There are more complex ways to figure out why it doesn’t appear in the said tab but I would like to share the more straightforward one.

After you click Choose Items (for the toolbox) and the dialog window appears, instead of switching to the “SSIS Data Flow Items” tab, stay on “.NET Components”. Then click Browse and browse to the assembly which contains your custom SSIS component.

If the chosen assembly doesn't contain an SSIS component, it will say so.

However, if the assembly contains errors (e.g. missing referenced assemblies, among others), it will also show an error dialog saying so.

That's it. I hope it helps and saves you some time (since I wasn't able to find this information quickly – even for a good “googler”).

And for the more “complex” method, I believe you can use fuslogvw or a similar assembly-binding logger/monitoring tool to figure out any issues with assembly binding (missing references) such as the one I encountered. But I suggest doing the “browse” test first before exploring those debugging techniques.

And finally, of course, as with any project: check that you have the right assemblies referenced, decide whether you want Specific Version true or false, and make sure that all dependencies are in place.


blog site update

I have recently consolidated some of the sites I maintain into one of my hosting accounts. (I'll post about the provider soon – so far so good.)

And that includes this blog. Along with this move I have also updated to the latest version of BlogEngine.NET – guilty of not being up to date, for a couple of reasons.

Anyways, here are some notable changes: [more]

1) post URLs are no longer timestamped – i.e. blog/post/yyyy/mm/* still works, but new links are now just blog/post/*

    – I just noticed earlier that when I upgraded to v1.6 I revised something to make the URL pattern yyyy/mm instead of the default yyyy/mm/dd. This caused all old post URLs linked from outside to stop working. Good thing there is a setting in the BE.NET admin to not include timestamps in the URL, and as a side effect the yyyy/mm pattern worked fine.

    – along with the issue mentioned above, Google's crawl also failed because of the missing pages, which affected rankings and statistics. Hopefully it will stabilize soon.

    – on this note, I had to manually set a custom crawl rate in the Google Webmaster site due to these changes.

2) other BlogEngine.NET new stuff for version 1.6 – most important of which is the Akismet anti-spam comment filter. Clearly BE.NET users have been plagued by spam recently.

    I have made changes, but not to the BlogEngine core, so hopefully future updates will be more seamless.

SQL Business Intelligence Developer Needed (Manila, Philippines)

We are currently looking for a SQL BI Developer to work with us on exciting, high-profile, high-scale projects. Feel free to contact me or visit http://www.lwsmedia.com/contact.htm. Looking forward to working with you. [more]

Company Profile:

Agilone LLC
Norwalk, CT USA; Los Gatos, CA USA; Istanbul, Turkey; and Manila, Philippines

Companies that can effectively understand, process and take value from their data gain a sizable competitive advantage.  However, many fail to do so, since the amount of data captured by organizations is growing more quickly than the capabilities of the tools to analyze it.  Agilone solves this problem with proprietary, SaaS-based analytical tools that help companies make data-driven marketing decisions that drive superior results.
Agilone’s goal is to help clients develop and execute data-driven marketing strategies. We provide clients with advanced technology and analytical marketing services to help identify and execute opportunities hidden in their data.  Our approach is effective because we analyze more of their data than the competition and offer a customized solution.
Typical engagements are in areas of customer valuation, pricing, response modeling, segmentation, with follow-on implementation of technologies such as data warehousing, web-based application development, database management and business intelligence services.
We are headquartered in Norwalk, CT with a technology development and services center in Istanbul, Turkey.  Recently, we have opened our new office in Silicon Valley (Los Gatos) California.

We are a high growth, entrepreneurial company and are always looking for intelligent hardworking people to join our company.  Transfers between Istanbul office and US offices are possible and we do sponsor work visas and Greencards in US depending on tenure and performance of employees.
Job Description:
This position is for our Manila Office.   Due to the increase in our business volume we need additional SQL developers with the following background and skill set:
Position Responsibilities:
– Be part of the developer team from analysis to design, programming, testing and deployment
– Design and administer project related databases
– Create technical documentation
– Perform unit tests of the code
– Author user manuals and installation guides
Minimum: BS, BA or equivalent, very good command of written and spoken English
Required Skills and Experience:
1. Excellent communication skills both written and verbal
2. Competent in T-SQL, working knowledge of MS SQL 2005 and MS SQL 2008.
3. Experience in Dimensional Databases and OLAP Cubes.
4. Desire to learn new platforms and environments as the projects require.
5. Strong analytical and problem solving skills with attention to detail.
6. Self-motivated – comfortable working in a fast paced environment with limited direction.
7. Ability to multi-task and work on several different projects.
8. Holding a Microsoft certification is a plus.
We will conduct two tests before an interview.  The first is a general aptitude test; after you pass it, you will be given a more specific T-SQL test.  After these tests you will be invited to an interview with the Director of Business Intelligence and the Principal of the company.

Register for Visual Studio 2010 Beta Exams (.NET 4.0)

Time for (free) Beta Exams again

Information can be found from here


Register via http://register.prometric.com

For easier reference, exams available are listed below (along with PromoCode)

Exam 71-511, TS: Windows Applications Development with Microsoft .NET Framework 4 – 511BC

Exam 71-515, TS: Web Applications Development with Microsoft .NET Framework 4 – 515AA

Exam 70-513: TS: Windows Communication Foundation Development with Microsoft .NET Framework 4 – 513CD

Exam 70-516: TS: Accessing Data with Microsoft .NET Framework 4 – 516B1

Exam 70-518: Pro: Designing and Developing Windows Applications Using Microsoft .NET Framework 4 – 518PE

Exam 70-519: Pro: Designing and Developing Web Applications Using Microsoft .NET Framework 4 – 519ZS

And as always, a word of advice: read and master the items in the prep guide to increase your chances of passing the exams.

Good luck!

DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component “Fuzzy Lookup” (60) failed with error code 0xC0202009 while processing input “Fuzzy Lookup Input” (61)

I encountered this error in one of my recent tasks involving SSIS Fuzzy Grouping and Lookup.

[Fuzzy Grouping Inner Data Flow : SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component “Fuzzy Lookup” (60) failed with error code 0xC0202009 while processing input “Fuzzy Lookup Input” (61). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.

[Fuzzy Grouping Inner Data Flow : SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component “OLE DB Source” (1) returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.

[Fuzzy Grouping [800]] Error: A Fuzzy Grouping transformation pipeline error occurred and returned error code 0x8000FFFF: “An unexpected error occurred.”.

These were the only relevant error messages and codes that I got, and they didn't seem very helpful. [more]

It appeared intermittent at first, until I noticed that the error occurs at a certain number of records – although not exactly repeatable, always very close to that value: around 4M rows.

While troubleshooting, I tried increasing available memory (I released some memory locked by another application), and the number of input records at which the error occurred increased.

I tested further and the behavior seemed consistent: the size of the input is proportional to the memory (RAM) used. I calculated the estimated size of every row I had to be about 1KB. I then tried processing 10M rows after freeing up at least 10GB of RAM, and it worked fine.

I could work around this by processing smaller sets/batches so that I don't push 10M rows in one pass, but I just needed to figure out what was causing it – and although it isn't very clear from the error message, I think increasing free memory did the trick for me.

Sharing this in case someone else runs into a similar error. Hope this helps!