Akira's ramblings
Sunday 24 November 2013
Changing the colours
A small change to the colours (and widths) to make the blog a bit more readable. I didn't realize earlier how annoying it was to read.
Messing around with Linux
A year and a half ago I tried to set up a Linux server. I had some problems with ethernet cards, and after a lot of back and forth the motherboard or the power supply went poof. I wasn't very happy at the time.
So, after a long time in which I couldn't do anything with it, I bought a new Corsair CX430, put together some old components that I had lying around from when I had a gaming machine (good times), and slapped it all together.
It was time to get back to building the system, especially now that I had a week off work coming and wasn't going anywhere, as my girlfriend was working. Last weekend I tried to install several distros, all without success:
* CentOS 6.2 wouldn't even start the installer
* CentOS 6.4 wouldn't even start the installer
* FreeBSD would install, and then would not boot
* openSUSE would install (after a few tries) and then would not boot
* Debian 7.2 would install and then would not boot
Of them all, the one that got me closest (and was the easiest) was the Debian installation. So this weekend I tried again.
I have never had a problem installing Windows on any machine: install it, then start working. That was it. Most of the distros that I tried are either general-purpose or server-oriented, so I really didn't expect it to be as easy as installing a client OS. I expect that Ubuntu would not have given me problems either.
The first thing that I had to do was disconnect two of my three hard drives. For reasons to do with the boot loader and some esoteric something-or-other, both openSUSE and Debian were getting their knickers twisted. Then I had to get a Debian Live CD to start the machine in failsafe mode so I could update the BIOS of my Asus mobo (the first time ever that I've been forced to update one), which thanks to flashrom was a breeze.
And then, finally, I got Debian 7.2 installed. I haven't really used Linux in a long while (the first time was in the mid-'90s, when I installed Slackware '96 as a folder inside my Windows OS and managed to delete half of my Windows installation). It is so much easier to get the information you need now than it was then.
I discovered that, at least on Debian, the user that you create during the installation doesn't have sudo permissions. That strikes me as odd, as it is supposed to be the administrator's user, precisely so you don't have to use the root account directly. But it is easy to set up.
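For the record, setting it up is a couple of commands run as root (a sketch only; "akira" stands in for whatever user was created during installation):

```
# As root — on Debian the installer's first user is not in the sudo group
apt-get install sudo        # in case sudo itself was not installed
usermod -aG sudo akira      # add the user to the sudo group
# Log out and back in for the group change to take effect; after that,
# `sudo -v` from that user should succeed.
```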
Next was being able to remote into the server over SSH from my Mac. As I had already selected the SSH package when installing Debian, it was very easy. Well, once I discovered that, because I did not have a local DNS server, I had to use the IP address.
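Until local DNS exists, an entry in the Mac's ~/.ssh/config makes the raw IP bearable (hostname, IP and user here are hypothetical, not my actual ones):

```
# ~/.ssh/config on the Mac
Host homeserver
    HostName 192.168.0.50   # the server's LAN IP, until local DNS is set up
    User akira
```

After which `ssh homeserver` is enough.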
Next part: get the server set up as the local DNS provider, which was relatively simple using bind9. A few config files, a few head-scratchers, forgetting to add the DNS server on my Mac, ... and there it goes: local DNS.
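The kind of configuration involved looks roughly like this (a sketch only; the zone name, host names and IPs are hypothetical, not the ones I actually used):

```
// /etc/bind/named.conf.local — declare the forward zone
zone "home.lan" {
    type master;
    file "/etc/bind/db.home.lan";
};

; /etc/bind/db.home.lan — a minimal zone file
$TTL    604800
@       IN  SOA server.home.lan. admin.home.lan. (
                2       ; serial
                604800  ; refresh
                86400   ; retry
                2419200 ; expire
                604800 ); negative cache TTL
@       IN  NS  server.home.lan.
server  IN  A   192.168.0.50
```

Plus pointing the Mac's DNS at the server's IP in Network Preferences (the step I forgot).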
Now I have Git installed, Mercurial installed, and it is time to add Subversion. I like the idea of using a few different VCSs (I am the admin of our TFS installation at work). Then Bugzilla, X-Planner and Jenkins. Just in time to start working on my Obj-C project (which I still have to see if I can make compile on the server with Jenkins).
I must say that I am really excited about the idea of having a Linux server running at home.
Sunday 29 September 2013
VB6 vs the modern world
A long, long time ago I worked with VB6. I was mostly fixing a bookkeeping application that someone before me (this time it was actually true ;-) ) had left in a really bad state. It had been developed as a bespoke application for a professional college. After my normal hours I would go to their offices to work alongside the accountant to get it working correctly. It was worse when the summer came, as I would have half working days (which meant reduced holidays), so after leaving work and enjoying the afternoon and early evening I had to go to the client's office to continue with the bug fixes. At the end of that summer I moved to England, and after a few years working here I decided I was not going back to Spain as an employee, even with the better quality of life outside work.
Back into the main point of this post, VB6. Oh, what memories!
I used to be annoyed by the option of not declaring variables and their types; I would make sure that the option to force declaration (Option Explicit) was selected. Now that I am properly learning a dynamic language (Python), I see the advantages (and still see the disadvantages). Was the VB6 language team onto something?
The first time I hit the maximum length of a subroutine I was angry. "What kind of shit is this!!!!" I exclaimed. Actually, it was: "¡¡¡Qué pedazo de mierda!!!". So I divided the ultra-massive function into two consecutive ones. I have always been good at keeping only the important things in my mind (or you could say that I have a bad memory, which is true as well), so ultra-massive methods would not slow me down that much, as I never tried to understand the whole of them. Nor was I dismayed by the size of a method. One line at a time to rule the world. But, oh, the joy of discovering Clean Code. Making small, single-purpose functions. The code becomes so much easier to read, to understand. I wonder again: was the VB6 language team onto something?
My last big annoyance was the lack of proper recursive calls. You had to create two different methods that would call each other. Taking into account that recursion is my favourite construct, that was painful. I wasn't even asking for tail-call optimisation (OK, back then I did not know what that was). Oh well, order is restored and VB6 still ranks as the worst language I have worked with. Which doesn't mean that it wasn't useful. Its simplicity and ease of creating Windows apps at that time was only matched by Delphi (of the languages that I knew).
Tuesday 4 June 2013
Symbiosis
Well, it has taken me longer than expected to write this one. Not because it was difficult to find the idea, or to put it into words, but mostly because I kept doing other things... and also because I was a bit lazy. One day I will be able to reliably write once a week (I don't need/want more).
When I was at university I read Extreme Programming Explained by Kent Beck. I was being taught at that time the standard development processes
(Waterfall, Iterative, ...). What it advocated was barely touched on at university (I don't remember them saying anything about testing, or showing the results to the stakeholders, or anything similar). It did leave an impression, and it was in the back of my mind once I graduated and started to work.
After a few years of working I was in a position to start moving my department into a fully agile environment (we are still on our way to being an agile shop). I introduced unit testing, taught the guys about refactoring (and, our code being C#, Ctrl+R, M is your friend), and we will be
introducing acceptance testing, continuous integration and TDD soon. We have one project being worked on under Scrum.
Around the same time that we started adding unit testing I read Clean Code by Uncle Bob et al. This book, the same as Extreme Programming Explained, left a big impression (although this time I was able to start applying its concepts as soon as I put the book down). Power through simplicity. I had the experience to see that big methods are problematic.
Let's add SOLID, YAGNI, DRY, ... to the mix, stir, and:
Progressively I realized that the old adage was still valid: "The whole is greater than the sum of its parts". Each part supports at least one other, in some kind of circular graph. Clean code facilitates unit testing, TDD leads to clean code, unit testing facilitates refactoring, refactoring allows you to create clean code, working under SOLID makes testing simpler, clean code facilitates SOLID, refactoring facilitates DRY.
There is a symbiosis formed out of the parts, each one feeding from another, and being a feeder itself.
And it comes back to what Kent Beck said in his book (paraphrasing): if each part is useful, using all the parts must be good.
Now I look at my previous code, things that I have written in the last few years, and I see so many places where it can be improved, where it can be made easier to read, to understand, to modify, ...
Before, I thought that programming was most probably a craft. Now I strongly believe so. You need to learn the tools of the trade, where those tools are not the software applications that you use, but the ideas, the knowledge, all the thinking that has been done before you. And then you have to hone your skills based on them, choose what is suitable or not, and adapt it to your (or your shop's/department's) needs so the software that you write stands against any adversity that befalls it.
Thursday 16 May 2013
Stylecop and NUnit revisited
Couple of points from the previous post:
"At the moment, I still need to install my rules.dll in the installation folder of StyleCop, otherwise new rules will not appear in the settings. I suspect that this is because the StyleCop settings editor uses the DLLs that are located in that same folder. I will have to test that assumption."
The assumption was correct. Good, as I don't have to close Visual Studio when I want to test a new rule.
"At the moment I am using the output path as the location of the source code/DLL/settings. I am trying to set up a specific folder that does not depend on where you have the solution (using pre/post-build events to make sure that it exists and has all the information), but then NUnit under ReSharper throws a tantrum because I have the same DLL in two different places (one in that specific folder, the other in the temp folder). I am not sure why it doesn't fail when the location is the output path."
It doesn't fall down when used directly from NUnit (instead of through the ReSharper runner inside Visual Studio). So I will not pursue a solution for the time being (though it probably won't be long before I have to).
Wednesday 15 May 2013
Basic unit testing of Stylecop with NUnit
The Set up
As part of the changes I have managed to introduce in my department, we have a specific coding guideline and we use StyleCop to enforce it. Actually, until now we enforced it only in part, as we did not have time to set up the new rules that we had agreed on. Now I have the time to set them up correctly.
Of course, the first rule that I created didn't work. And it was time to bring in testing (which I should have done from the beginning; I'm still coming to grips with TDD). And oh boy, was it complicated. There is not much explanation of what you need to do, and the two examples that I located (the StyleCop source code, and StyleCop Contrib) are far more complicated than what I wanted. Especially as they seem to use test files that can have more than one violation, and then filter the violations to see if the one they are looking for is there.
What did I want? Easy: to be able to test a single rule (nothing else).
The Solution (basic approach)
The basic code that I created is shown below (look further down for an explanation). Of course, this is the most basic version, and as soon as you have two test methods you will want to extract the common code.
[Test]
public void Test()
{
    // Path to the source file that should trigger exactly one violation.
    String filePath = LocationOfCodeUsedForTesting;

    // Settings file with only the rule under test active.
    String styleCopSettings = Path.Combine(LocationOfDlls, "StyleSettings.StyleCop");

    StyleCopConsole console = new StyleCopConsole(
        styleCopSettings,                    // settings to use (null = default installation)
        false,
        null,                                // output path (null = no output file)
        new List<String> { LocationOfDlls }, // where the StyleCop and rules DLLs live
        true);

    CodeProject project = new CodeProject(
        Guid.NewGuid().GetHashCode(),        // throwaway unique identifier
        LocationOfDlls,
        new Configuration(new String[0]));

    Boolean result = console.Core.Environment.AddSourceCode(project, filePath, null);

    List<Violation> violations = new List<Violation>();
    console.ViolationEncountered += (sender, args) => violations.Add(args.Violation);

    console.Start(new List<CodeProject> { project }, true);

    violations.Count.Should().Be(1);
    violations[0].Line.Should().Be(16);
}
Explanation
First, the file path is for the code that we want to test. LocationOfCodeUsedForTesting is just a variable with the location.
The StyleCop settings are the settings that we want to use. If we pass null on the following line, it will use those of the default installation. Personally, I have set only the rules that I am creating as active, and then selected not to use the parent settings.
The StyleCopConsole is the object that will analyze the code. The first parameter is the settings that we want to use (as said before, null for the ones present in the StyleCop installation). The null is the output path; if we specify an actual path, it will try to write output there (but for testing I wasn't interested). The fourth one is the location of the DLLs for StyleCop and our own rules.
The CodeProject object represents a project. I am not sure the second parameter is needed (and I cannot test it now). The first one is a unique identifier; I am using a Guid because it is just a throwaway identifier to run the test.
It is a funny way to add source code to a project: we call the environment inside the StyleCopConsole and pass it the project to which we want to add the source code, plus the location of the source code.
We then need to hook into the ViolationEncountered event, so we can collect a list of violations.
And then we analyze the code (console.Start).
To do the checks, I am using FluentAssertions with NUnit.
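Once a second test appears, the boilerplate above can be pulled into a helper along these lines (a sketch only: GetViolations is a name I am making up here, and it assumes the same LocationOfDlls and settings-file conventions as the test above):

```
// Hypothetical helper: runs StyleCop over one file and returns its violations.
private static List<Violation> GetViolations(String filePath)
{
    String styleCopSettings = Path.Combine(LocationOfDlls, "StyleSettings.StyleCop");
    StyleCopConsole console = new StyleCopConsole(
        styleCopSettings, false, null, new List<String> { LocationOfDlls }, true);

    CodeProject project = new CodeProject(
        Guid.NewGuid().GetHashCode(), LocationOfDlls, new Configuration(new String[0]));
    console.Core.Environment.AddSourceCode(project, filePath, null);

    // Collect every violation raised during analysis.
    List<Violation> violations = new List<Violation>();
    console.ViolationEncountered += (sender, args) => violations.Add(args.Violation);
    console.Start(new List<CodeProject> { project }, true);
    return violations;
}

// Each test then reduces to something like:
// GetViolations(LocationOfCodeUsedForTesting).Count.Should().Be(1);
```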
Facts and next steps
There are a few things that I found interesting/baffling/in need of research:
Probably because I am using ReSharper, the executing assembly when I run/debug the tests is in a temporary folder. I cannot use that temporary folder to store the source code/DLL/settings; for some reason it fails (I think there could be a permissions problem).
I couldn't use the default location of StyleCop.dll (which is where you have to put your own rules DLL as well), because then every change I make requires closing and reopening Visual Studio.
At the moment I am using the output path as the location of the source code/DLL/settings. I am trying to set up a specific folder that does not depend on where you have the solution (using pre/post-build events to make sure that it exists and has all the information), but then NUnit under ReSharper throws a tantrum because I have the same DLL in two different places (one in that specific folder, the other in the temp folder). I am not sure why it doesn't fail when the location is the output path.
The files that I use for testing have a Build Action of "None" and are copied to the output folder if newer. When running StyleCop inside Visual Studio, it will not analyze files that do not have the Build Action set to "Compile". Which puzzled me for a few seconds (why do my tests pass, but then it doesn't actually find anything in the file when running normally?).
At the moment, I still need to install my rules.dll in the installation folder of StyleCop, otherwise new rules will not appear in the settings. I suspect that this is because the StyleCop settings editor uses the DLLs that are located in that same folder. I will have to test that assumption.
Conclusion
There are still a few things to be done, but the basics for setting up testing of StyleCop rules are there. I like this approach more than the other two mentioned (StyleCop and StyleCop Contrib), because I prefer to have a single file with a single issue that is easy to test. Unit tests should only test a single thing, without noise.
Well, I hope the information is helpful for you. I will add more information in the future, when I solve the couple of stumbling blocks that I have.
Saturday 20 April 2013
Shaving
I have a Gillette Mach 3 Turbo. It is a superb razor. It does a very good job of leaving my skin smooth, devoid of hair. But using it makes my skin sore and sometimes I cut myself.
I have a Wilkinson Sword Hydro 3. The cut is not as good as the Gillete. But my skin barely suffers, and I just can't cut myself.
Finally, I have a beard trimmer, which leaves a nice stubble. And, unlike with the other two, I don't need to have my face soaking wet (ideally after a shower) to reduce the chaos on my face.
I use the first one on very special occasions or when there is a requirement of formality (weddings, interviews).
I use the second one when I need to take off my beard, but it is not a "big" occasion (going to a party, visiting family, ...). I also use it if I haven't shaved in a long while, as preparation the day before using the first one.
I use the third one for keeping my face relatively tidy, or when I go to standard social events (a dance night or visiting friends).
As an aside, at some point I had a Wilkinson Quattro that I had to stop using because, due to the thickness of my hair, it would clog up constantly (the space between the blades was just not big enough).
I have three grooming tools that I use depending on the circumstances.
And yet, at the four different companies I have worked for, we had a single programming language to do everything, disregarding what was the best tool for the job.