Tuesday, December 19, 2017

T4 Generation Templates VS 2017

Hi,

It took me a lot of time to find good resources about T4 (even though there are no real built-in alternatives in Visual Studio). Here is a small template to create multiple files easily...

I used the following resources (some of them are only available via archive.org):

Further resources:


kr,
Daniel

<#@ template debug="false" hostspecific="true" language="C#" #>
<#@ assembly name="System.Core" #>
<#@ import namespace="System.Linq" #>
<#@ import namespace="System.Text" #>
<#@ import namespace="System.IO" #>
<#@ import namespace="System.Collections.Generic" #>
<#@ output extension=".log" #>
<#
CleanOutputFolder();

string propertyName = "Prop";

Enumerable.Range(1, 3).ToList().ForEach(id => {

    StringBuilder builder = new StringBuilder();
    builder.AppendLine("public class Data" + id.ToString("00"));
    builder.AppendLine(" {");
    Enumerable.Range(1, 3).ToList().ForEach(propId => {
        builder.AppendLine(" public int " + propertyName + propId + " { get; set; }");
    });
    builder.AppendLine(" }");

    CreateClass("Data" + id.ToString("00") + ".cs", builder.ToString());
});


#><#+
public void CleanOutputFolder()
{
    string templateDirectory = Path.GetDirectoryName(Host.TemplateFile);
    string outputFolder = Path.Combine(templateDirectory, "output/");
    Directory.CreateDirectory(outputFolder); // make sure the folder exists before cleaning it
    foreach(var file in Directory.GetFiles(outputFolder))
    {
        File.Delete(file);
    }
}

public void SaveOutput(string outputFileName)
{
      string templateDirectory = Path.GetDirectoryName(Host.TemplateFile);
      string outputFilePath = Path.Combine(templateDirectory, "output/", outputFileName);
      File.WriteAllText(outputFilePath, this.GenerationEnvironment.ToString());

      this.GenerationEnvironment.Remove(0, this.GenerationEnvironment.Length);
}

public void CreateClass(string fileName, string content)
{
#>
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace Test
{
<#= content #>
}

<#+
SaveOutput(fileName);
}
#>
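For reference, each run then produces Data01.cs through Data03.cs in the output folder next to the template; whitespace aside, Data01.cs should look roughly like this:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    using System.Threading.Tasks;

    namespace Test
    {
        public class Data01
        {
            public int Prop1 { get; set; }
            public int Prop2 { get; set; }
            public int Prop3 { get; set; }
        }
    }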


Monday, December 18, 2017

Dumping objects in c#

Hi,

I had a look at different possibilities to dump .NET objects. First I was like: "just serialize that stuff and everything will be fine", but the more I thought about it, the more I realized that I needed more...

First of all: LinqPad did a great job with its functionality to dump objects, which was very inspiring to me...

see:



... so what might be a good solution to embed into our code...?

A good-fitting solution seemed to be ObjectDumper (unfortunately you can find several different projects with the same name on Google):

ObjectDumper (version: stone-age)



ObjectDumper (version: old)



ObjectDumper (not a single-file-solution anymore, Options to ignore fields)



ObjectDumper (Finally the current version without a single commit in the last 2 years... same version as CodePlex?)



So I started further testing with the github version:


  • installation: copy Dumper.cs and DumperOptions
  • DumperOptions can be removed easily... only one line must be patched (the rest is just handing the options object over to the recursive call)


IEnumerable<FieldInfo> fields = XY ? Enumerable.Empty<FieldInfo>() : type.GetFields(BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic);

You can replace XY with a constant or a static field, so even the GitHub version can become a single-file solution again.

  • alternatively a NuGet package is available (as already mentioned)
  • it's a simple recursive solution based on GetFields and GetProperties
  • it can read private data using reflection but cannot handle static members
  • every reference-type instance is registered with an ObjectIDGenerator, which makes it easy to show cross-references between objects (e.g. backing field and property)
    • not a feature I really need, so I tried to delete it, but found out that it is also a perfect guard against stepping into circular references, so it is not only there for printing purposes (see the sketch below)
For me it works perfectly.
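Just to illustrate that last point (this is not the library's actual code, only a minimal sketch of how ObjectIDGenerator can double as a cycle guard):

    using System.Runtime.Serialization;

    class CycleGuard
    {
        private readonly ObjectIDGenerator ids = new ObjectIDGenerator();

        // GetId hands out a new id the first time an instance is seen;
        // firstTime == false means it was already dumped -> stop recursing and print "(see #id)"
        public bool ShouldVisit(object instance, out long id)
        {
            bool firstTime;
            id = ids.GetId(instance, out firstTime);
            return firstTime;
        }
    }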

The following Test shows a first impression:

        public class Data
        {
            private int secret = 15;
            private string secondSecret = "15";
            public static string staticData = "123";
            public int Public => secret * 2 + 3;
            public string Something => "something";
            public int StoredData { get; set; }
            public string StoredData2 { get; set; }
            public Data()
            {
                this.StoredData = 1;
                this.StoredData2 = "1";
            }
            public Data2 data2 = new Data2();
            public class Data2
            {
                public bool IsData { get; set; } = false;
            }
        }

Dumped:

#1: data [DumpTester.Program+Data]
{
   properties {
      Public = 33 [System.Int32]
      #2: Something = "something" [System.String]
      StoredData = 1 [System.Int32]
      #3: StoredData2 = "1" [System.String]
   }
   fields {
      secret = 15 [System.Int32]
      #4: secondSecret = "15" [System.String]
      <StoredData>k__BackingField = 1 [System.Int32]
      <StoredData2>k__BackingField = "1" [System.String] (see #3)
      #5: data2 [DumpTester.Program+Data+Data2]
      {
         properties {
            IsData = False [System.Boolean]
         }
         fields {
            <IsData>k__BackingField = False [System.Boolean]
         }
      }
   }
}
you see:
  • no static info
  • auto-properties appear with their backing fields
  • for reference-type auto-properties you see the linkage between property and field (see #3)
  • value types have no reference id
  • types in the System namespace are not dumped


kr,
Daniel

Thursday, November 9, 2017

benchmarking

Found a cool benchmarking NuGet package. It is called NBench, and unfortunately I am not sure whether the project is dead or not, but in its current state it can complement functional unit tests with non-functional performance tests, which is quite cool.

It can be used for .NET and .NET Core projects in a variety of ways. This project fills one of the gaps of a nightly build run and might be helpful in many cases to keep quality high.

see: https://github.com/petabridge/NBench
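A minimal benchmark class, roughly following the NBench readme (attribute and enum names from memory, so double-check them against the current docs):

    using NBench;

    public class CounterBenchmark
    {
        private Counter counter;

        [PerfSetup]
        public void Setup(BenchmarkContext context)
        {
            counter = context.GetCounter("TestCounter");
        }

        [PerfBenchmark(NumberOfIterations = 3, RunMode = RunMode.Throughput,
            RunTimeMilliseconds = 1000, TestMode = TestMode.Test)]
        [CounterThroughputAssertion("TestCounter", MustBe.GreaterThan, 1000000.0d)]
        [CounterMeasurement("TestCounter")]
        public void Benchmark()
        {
            counter.Increment();
        }
    }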

IIS development and exceptions

In a production environment it is often a good idea to hide some errors of e.g. a web page, so you can work on the root cause of the error in the background and the customer will not identify some strange behavior as a bug.

Exactly the opposite is true for developers and testers. Find bugs! Don't cover them with any UI candy or the like... and exactly into this trap I fell...

see customErrors on msdn: https://msdn.microsoft.com/de-at/library/h0hfz6fc(v=vs.110).aspx

and for web api 2 see: https://msdn.microsoft.com/en-us/library/system.web.http.filters.exceptionfilterattribute(v=vs.118).aspx
https://docs.microsoft.com/en-us/aspnet/web-api/overview/error-handling/exception-handling
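For Web API 2 such a filter can be as small as the following sketch (the trace call is just a placeholder for your own logging; register it via config.Filters.Add(...)):

    using System.Web.Http.Filters;

    public class LogExceptionFilterAttribute : ExceptionFilterAttribute
    {
        public override void OnException(HttpActionExecutedContext context)
        {
            // log and keep the exception visible instead of swallowing it
            System.Diagnostics.Trace.TraceError(context.Exception.ToString());
        }
    }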

see also AppDomain.UnhandledException, Application.ThreadException / Application.SetUnhandledExceptionMode (for WinForms), DispatcherUnhandledException (for WPF), Application_Error (IIS Global asax)

Be careful to disable stuff that makes it hard to find bugs during testing :-) 

Tuesday, October 31, 2017

mstest with test cases

I am still wondering why there is no better support in MSTest for test cases like there is in NUnit (as described in http://nunit.org/docs/2.5/testCase.html)... Nevertheless I wrote a code snippet which makes life a bit easier... I assemble the data to test into the test name, which is not as bad as it might sound, because in some cases (e.g. testing mathematical functions) you want to see the input data directly anyway and would write it into the name in any case... so we can use the name of the test and reflect over it like:

        public static int[] GetIntArrayFromName()
        {
            StackTrace t = new StackTrace(skipFrames: 1);
            var frames = t.GetFrames();
            string name = frames.First().GetMethod().Name;
            return name.Split('_').Skip(1).Select(x => int.Parse(x)).ToArray();
        }

so if you call this function inside a test-method which is called something like Test_1_2_3 you will get an array like new[]{1,2,3} which might fit quite well.

            [TestMethod] public void Add_1() => AddMethod(GetIntArrayFromName());
            [TestMethod] public void Add_1_2() => AddMethod(GetIntArrayFromName());
            [TestMethod] public void Add_5_4_6_3_1() => AddMethod(GetIntArrayFromName());
            [TestMethod] public void Add_7_5_1_3_4() => AddMethod(GetIntArrayFromName());
           
the rest is copy / pasting which is easy...

Just to mention it, there is some kind of support using the DataSource attribute in MSTest... see: https://msdn.microsoft.com/en-us/library/microsoft.visualstudio.testtools.unittesting.datasourceattribute.aspx and https://stackoverflow.com/questions/21608462/how-to-run-unit-test-with-multiple-datasource
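As a side note: the newer MSTest V2 packages (MSTest.TestFramework) bring something very close to NUnit's TestCase via DataRow, so the name-reflection trick is mainly for older MSTest versions; a minimal sketch:

            [DataTestMethod]
            [DataRow(1, 2, 3)]
            [DataRow(5, 4, 9)]
            public void Add_ReturnsSum(int a, int b, int expected)
            {
                Assert.AreEqual(expected, a + b);
            }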

Saturday, October 21, 2017

swagger integration into webapi project (Part 2 - .net core)

While trying to setup a test web-api solution in .net core I was wondering whether the swagger integration even works for .net core with the swashbuckle nuget and yes... it does work!

I used swashbuckle.aspnetcore (with .swagger / .swaggergen / .swaggerui)

the only things I had to add in startup.cs were:

ConfigureService:

  • AddMvc
  • AddMvcCore
  • AddApiExplorer
  • AddSwaggerGen
    • SwaggerDoc
    • IncludeXmlComments
Configure:
  • UseMvc
  • UseSwagger
  • UseSwaggerUI
    • SwaggerEndpoint
done.

Every created controller will from now on be listed in swagger UI.
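Pieced together, a minimal Startup.cs along these lines could look roughly like this (class and method names from the 2017-era Swashbuckle.AspNetCore package; title and XML file name are placeholders):

    using System;
    using System.IO;
    using Microsoft.AspNetCore.Builder;
    using Microsoft.Extensions.DependencyInjection;
    using Swashbuckle.AspNetCore.Swagger;

    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddMvc();
            services.AddSwaggerGen(c =>
            {
                c.SwaggerDoc("v1", new Info { Title = "Test API", Version = "v1" });
                c.IncludeXmlComments(Path.Combine(AppContext.BaseDirectory, "TestApi.xml"));
            });
        }

        public void Configure(IApplicationBuilder app)
        {
            app.UseSwagger();
            app.UseSwaggerUI(c => c.SwaggerEndpoint("/swagger/v1/swagger.json", "Test API v1"));
            app.UseMvc();
        }
    }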

Friday, October 13, 2017

generic execution of stored procedures in c# accessing sql server

for generic execution of stored procedures I found some helpful links to generate code:

https://stackoverflow.com/questions/20115881/how-to-get-stored-procedure-parameters-details
https://raresql.com/2014/01/18/sql-server-how-to-retrieve-the-metadata-of-a-stored-procedure/
https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-exec-describe-first-result-set-transact-sql

... but in fact I want to pass in generic data without much validation beforehand, because I want to keep it RAD (rapid application development) and test it with integration tests.

Nevertheless with sqlfiddle.com we can validate that:
  create procedure t1(@x int) as
    select 1 as resultValue
  go

can be executed using (e.g.):
  exec dbo.t1 3;

parameters can be queried using
select 
   'Parameter_name' = name, 
   'Type'   = type_name(user_type_id),
   'Nullable' = is_nullable,
   'DirectionOut' = is_output,
   'Length'   = max_length, 
   'Prec'   = case when type_name(system_type_id) = 'uniqueidentifier'
              then precision 
              else OdbcPrec(system_type_id, max_length, precision) end, 
   'Scale'   = OdbcScale(system_type_id, scale), 
   'Param_order'  = parameter_id, 
   'Collation'   = convert(sysname,
                   case when system_type_id in (35, 99, 167, 175, 231, 239) 
                   then ServerProperty('collation') end)  ,
   system_type_id, user_type_id
  from sys.parameters
  where object_id = object_id('dbo.t1')
  order by param_order

(first) result record set meta data can be queried using
   SELECT * FROM sys.dm_exec_describe_first_result_set ('exec dbo.t1 3', NULL, 0) ;  


So executing a stored procedure and retrieving a DataTable can be achieved using the code from:
https://stackoverflow.com/questions/25121021/generic-execution-of-stored-procedure-in-csharp

    public DataTable RunSP_ReturnDT(string procedureName, List<SqlParameter> parameters, string connectionString)
    {
        DataTable dtData = new DataTable();
        using (SqlConnection sqlConn = new SqlConnection(connectionString))
        {
            using (SqlCommand sqlCommand = new SqlCommand(procedureName, sqlConn))
            {
                sqlCommand.CommandType = CommandType.StoredProcedure;
                if (parameters != null)
                {
                    sqlCommand.Parameters.AddRange(parameters.ToArray());
                }
                using (SqlDataAdapter sqlDataAdapter = new SqlDataAdapter(sqlCommand))
                {
                    sqlDataAdapter.Fill(dtData);
                }
            }
        }
        return dtData;
    }

this link shows an easy way to map datatables and datarows to objects

https://www.exceptionnotfound.net/mapping-datatables-and-datarows-to-objects-in-csharp-and-net-using-reflection/

(things dapper is doing for us in general).

So this opens up a lot of opportunities for strongly typed argument objects (or an on-the-fly generated instance from a json-string) and output handling with a list of strongly typed instances mapped by datarows. There only needs to be a mapping between class and procedure name AND a mapping between argument fields and parameter names.
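A stripped-down version of that reflection mapping (just a sketch of the idea: property names have to match the column names of the result set, and nullable properties would need extra handling):

    // needs System.Data and System.Reflection
    public static List<T> MapToList<T>(DataTable table) where T : new()
    {
        var properties = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
        var result = new List<T>();
        foreach (DataRow row in table.Rows)
        {
            var item = new T();
            foreach (var property in properties)
            {
                if (table.Columns.Contains(property.Name) && row[property.Name] != DBNull.Value)
                {
                    property.SetValue(item, Convert.ChangeType(row[property.Name], property.PropertyType));
                }
            }
            result.Add(item);
        }
        return result;
    }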


swagger integration into webapi project

In the (let's say) "early days" of .NET's WebApi the controllers and their operations could be listed (with the option to try the operations) using the NuGet https://www.nuget.org/packages/Microsoft.AspNet.WebApi.HelpPage/ which I believe is no longer maintained, because the last update was in February 2015, which is 2.5 years ago.

During some research I found a perfect alternative which seems to be the more or less official successor: https://www.nuget.org/packages/Swashbuckle .

There is a perfect tutorial from redgate related to swashbuckle at: https://www.red-gate.com/simple-talk/dotnet/net-development/visual-studio-2017-swagger-building-documenting-web-apis/

It works within 5 minutes and allows you to generate REST API clients, which means a perfect fit between client and server.

Things I changed (after installing the nuget):

  • c.DocumentTitle
  • c.IgnoreObsoleteActions
  • c.IgnoreObsoleteProperties
  • c.IncludeXmlComments

    with the function from the Redgate blog entry (enable the XML documentation file in the project's build properties)

    protected static string GetXmlCommentsPath()
    {
        return System.String.Format(@"{0}\bin\webDemo.XML",
            System.AppDomain.CurrentDomain.BaseDirectory);
    }
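Pieced together, the relevant part of SwaggerConfig.cs then looks roughly like this (Swashbuckle for the full .NET Framework; the titles are placeholders):

    GlobalConfiguration.Configuration
        .EnableSwagger(c =>
        {
            c.SingleApiVersion("v1", "webDemo API");
            c.IgnoreObsoleteActions();
            c.IgnoreObsoleteProperties();
            c.IncludeXmlComments(GetXmlCommentsPath());
        })
        .EnableSwaggerUi(c =>
        {
            c.DocumentTitle("webDemo API");
        });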

I did not need to adapt the global.asax file in the root folder (and a global.asax in a sub-folder does not work, which is very logical in hindsight, but it took me an hour of research to find that bug).

In the global.asax (application_start) I still have:

  • simple-injector init (see: http://simpleinjector.readthedocs.io/en/latest/webapiintegration.html
  • GlobalConfiguration.Configure((config) =>
    {
      ((HttpConfiguration)config).MapHttpAttributeRoutes();
    });



kr,
Daniel

Wednesday, October 11, 2017

sql server - datetime to number

I needed to transform minutes and seconds into a decimal number... it was not as easy as I originally thought... here are the snippets:

I needed to find out the current hours, minutes and seconds:

DATEPART(SECOND, CURRENT_TIMESTAMP)
DATEPART(MINUTE, CURRENT_TIMESTAMP)

DATEPART(HOUR,   CURRENT_TIMESTAMP)

(works for every part of the current timestamp...)

afterwards I needed to convert it to decimal and assemble the parts into one number using convert.

something like:
select convert(decimal(6, 2), @hour * 100 + @min) + convert(decimal(6, 2), @sec) / 100;

Now to cut off seconds we just need to convert the number to int again (as in the good old days of programming).

set @hourAndMin = convert(int, @hourAndMinAndSeconds)

Friday, July 14, 2017

SQL Server - sp_search_code

Hi,

the following stored procedure, dating back to the stone age of SQL Server 7.0, is (up till now) very helpful when searching for usages of objects inside database code.

http://vyaskn.tripod.com/code/search_stored_procedure_code.txt
"Copyright © 1997 - 2002 Narayana Vyas Kondreddi. All rights reserved."



It queries the syscomments table so I would simplify the query to something like

   select * from syscomments c 
   where c.text like '%<TEXTTOCHECK>%' 

... but of course the check for encryption and the object properties used for filtering make absolute sense when you are allowed to deploy such a maintenance stored procedure.

Wednesday, May 31, 2017

starting with polymer 2.0 in visual studio

I was looking for an alternative to Angular 2/4 since I realized that TypeScript development is not as much fun as it looks... not even in Visual Studio... As an Angular 1 developer I liked the features of Angular but was kind of overwhelmed by the overhead I needed to set up the same thing in Angular 2/4. So after some days of research I found Polymer, which fits perfectly.

With VS2017 we have full Bower support, which is great, so just start with the download of the Bower package polymer (menu: Project / Bower package management). The dependencies are installed automatically, which made it easy to start my project. Additionally I added the bower_components folder to my project and started with an HTML page (more exactly with an ASP.NET page, but that does not really matter for this article).

On the same level as bower_components I added a folder elements where I wanted to place my Polymer elements. My "hello world" example here is a Polymer element called say-hello.html, which says hello to a person whose name is passed as a parameter.

The code of this html file is quite simple:

<link rel="import" href="../bower_components/polymer/polymer-element.html">

<dom-module id="say-hello">
    
    <template>
        <style>
            :host {
                color: forestgreen;
            }
        </style>
        <div>Hello {{guy}}!</div>
    </template>
    <script>
        class SayHello extends Polymer.Element {
            constructor() { super(); }
            static get is() { return 'say-hello'; }
            static get properties() {
                return {
                    guy: { type: String, value: '<default_name>' }
                };
            }
        }
        customElements.define(SayHello.is, SayHello);
    </script>
</dom-module>
Without going into any further details, say-hello.html can now print "Hello John!" in green if "John" is set as the value of the guy attribute in a call like <say-hello guy="John"></say-hello>.

A very helpful resource, e.g. for the class definitions in JavaScript (ES6), is http://es6-features.org/#ClassDefinition, which I used a lot.

kr,
Daniel

Friday, March 24, 2017

Fluent interface (API) in C#

Hi,

I was wondering why "fluent API" or "fluent interface" gets so many developers into trouble. I do like that kind of style, but I am not sure I would sacrifice my whole development style for this nice calling structure.

As a C# developer I found a way that works for me. I wrote a generic extension method like this:
    static class Fluent
    {
        public static T Do<T>(this T item, Action<T> method)
        {
            method(item);
            return item;
        }
    }

Now I can work fluently like this:
    class Program
    {
        static void Main(string[] args)
        {
            Console.Out
                .Do(x => x.WriteLine("test1"))
                .Do(x => x.WriteLine("test2"))
                .Do(x => x.WriteLine("test3"))
                .Do(x => x.WriteLine("test4"))
                .Do(x => x.WriteLine("test5"))
                .Do(x => x.WriteLine("test6"));
        }
    }

great.

It is probably not as expressive as a real fluent interface, but it works for every single object in every single case...

Nevertheless see Martin Fowler's original post on FluentInterfaces: https://www.martinfowler.com/bliki/FluentInterface.html

kr,
Daniel

Saturday, March 18, 2017

Lightning fast (minimal) setup for a JSON Rest API using WebAPI

During development it is often necessary to mock services or create services for testing. These don't need to consider security or performance issues, but they have to return valid responses.

So... 

  • Create ASP.NET-WebApplication (currently I used regular .NET framework application and not CORE)
    • consider to use Azure or not (I believe that for test-services no Azure usage will be the default)
  • Empty Template
  • Add NuGets
    • Microsoft.AspNet.WebApi and its dependencies
    • (Newtonsoft.Json for JSON Operations)
    • (System.Data.SQLite for in Memory or file Database implementing Linq and standard interfaces usable for any DB-Communication library)
    • (Dapper for DB-Communication)
  • Add a global.asax file
    • Add the following code to Application_Start
GlobalConfiguration.Configure((config) =>
                {
                    ((HttpConfiguration)config).MapHttpAttributeRoutes();

                    config.Routes.MapHttpRoute(
                        name: "DefaultApi",
                        routeTemplate: "api/{controller}/{id}",
                        defaults: new { id = RouteParameter.Optional }
                    );
                });
  • Add a new Item (WebApi-Controller) ... I called it in my example MyController
  • I removed the code generated and implemented the following example for demonstration
        [HttpGet]
        public IHttpActionResult GetData()
        {
            return Ok(new List<string> { "data1", "data2", "data3" });
        }
  • you can test this method (after starting in debugger) using the web browser
    see: http://localhost:63268/api/My/
    (Attention: the port is randomly generated by Visual Studio; check in your project properties which port Visual Studio assigned for debugging sessions, or see which page is opened when the debugging session starts).
This small tutorial showed an easy way of creating a REST Api using WebApi. Consider using http://www.restapitutorial.com/ for further investigation into REST (especially which Http-Methods should be used for which operation) because using the wrong http-method can be really confusing.

If you added the NuGets in parentheses above, you would be able to use better data stores than a "List". By using:
            data = new SQLiteConnection("Data Source=:memory:");
            data.Open();
            
... you can create an in-memory datastore with an initialized connection object implementing IDbConnection, which is the standard interface for SqlConnection, OdbcConnection and all the others like Oracle and so on...
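For the Dapper query below to return anything, the in-memory database of course needs a table first; a minimal seed (table and column names are just examples) could look like:

            data.Execute("create table data (id integer primary key, name text)");
            data.Execute("insert into data (name) values (@name)",
                new[] { new { name = "data1" }, new { name = "data2" }, new { name = "data3" } });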

Using Dapper turns a simple GET request into a really simple one-liner. See:

        [HttpGet, Route("api/InMemory/Dapper")]
        public IHttpActionResult TestDapper()
        {
            return Ok(data.Query("select * from data"));
        }

Consider that this storage is non-persistent, but could be perfectly used for testing purposes.

kr, Daniel

Tuesday, February 7, 2017

check for interfaces in c#

today an interesting question about interfaces came up. Does reflection's GetInterfaces() return all interfaces of all hierarchical levels?

Test-Suite:
    public interface ILevel1{}
    public interface ILevel2 : ILevel1, IAdditionalInterface2{}
    public interface ILevel3 : ILevel2{}
    public interface IAdditionalInterface {}
    public interface IAdditionalInterface2 { }
    public class MyClass : ILevel3, IAdditionalInterface {}

The answer is yes: it does!

MyClass implements all of these interfaces (including the inherited ones), which can be tested using the
  • "is"-operator (INSTANCE is INTERFACE_TO_CHECK) or
  • INSTANCE.GetType().GetInterfaces().Contains(typeof( INTERFACE_TO_CHECK ))
both methods return the same result.
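A quick check against the test suite above (GetInterfaces().Contains needs using System.Linq):

    static void Main()
    {
        var instance = new MyClass();

        // the "is" operator also covers interfaces picked up through the interface hierarchy
        Console.WriteLine(instance is ILevel1);                // True
        Console.WriteLine(instance is IAdditionalInterface2);  // True (inherited via ILevel2)

        // GetInterfaces() flattens the whole hierarchy as well
        Console.WriteLine(instance.GetType().GetInterfaces().Contains(typeof(IAdditionalInterface2))); // True
    }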

kr,
Daniel

Wednesday, January 25, 2017

FakeItEasy essentials

Today I had a closer look at FakeItEasy: a mocking (or faking) framework to create objects which can be configured freely from the outside, in order to unit test the classes working with these objects. Additionally it can create dummies (unneeded objects created just to satisfy an interface).

FakeItEasy can be installed using NuGet without any dependencies. Internally it uses the Castle project (which makes me believe that FakeItEasy is more or less a super-powerful dynamic proxy).


  • A.Fake<class or interface>(); (also CollectionOfFake)
    • tons of creation options can be added in overloaded Fake functions
      • e.g: WithArgumentsForConstructor, Implements,...
    • most interesting (for me) is .CallsBaseMethods so any object can be wrapped and be used with fakeiteasy magic
  • A.CallTo(...); // Properties: A.CallToSet
    • Arguments:
      • exact: "1", "xyz",..
      • by type: A<string>._
    • Conditions: WithReturnType, Where, To
    • ReturnValues: Throws, Returns, ReturnsNextFromSequence, ReturnsLazily, ThrowsAsync, AssignOutAndRefParameters, AssignOutAndRefParametersLazily
    • Behaviors: DoesNothing, CallsBaseMethod, Invoke
    • Matchers: MustHaveHappened (Repeated), That.Matches(...)
  • Raise.With
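A tiny example of the calling style (needs using FakeItEasy; ICalculator is a made-up interface just for this demo):

    public interface ICalculator { int Add(int a, int b); }

    [TestMethod]
    public void Calculator_CanBeFaked()
    {
        var calculator = A.Fake<ICalculator>();
        A.CallTo(() => calculator.Add(A<int>._, A<int>._)).Returns(42);

        Assert.AreEqual(42, calculator.Add(1, 2));
        A.CallTo(() => calculator.Add(1, 2)).MustHaveHappened();
    }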


Restrictions:

  • cannot be used with static or sealed classes
  • methods that are not virtual or abstract cannot be overridden

It took me about 3 hours to read the docs and to test my sample, but I haven't found any show stoppers... Going to use it in my tests, and I am looking forward to writing more about it...

Saturday, January 21, 2017

Check for usb-sticks in windows

Hi,

I wanted to check in my app whether usb sticks are connected or not (automatic recognition)...

Some research later I found the following article:

https://www.codeproject.com/Articles/63878/Enumerate-and-Auto-Detect-USB-Drives

works quite straight forward ...

things to mention:
- for different use-cases it might be enough to check Win32_LogicalDisk where drivetype=2

- in wpf you might use something like:

        protected override void OnSourceInitialized(EventArgs e)
        {
            base.OnSourceInitialized(e);
            HwndSource source = PresentationSource.FromVisual(this) as HwndSource;
            source.AddHook(WndProc);
        }
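A matching WndProc hook handling WM_DEVICECHANGE might look like this (a sketch only; the constants are the standard Win32 values, and the WMI re-query goes where the comment is):

        private const int WM_DEVICECHANGE = 0x0219;
        private const int DBT_DEVICEARRIVAL = 0x8000;
        private const int DBT_DEVICEREMOVECOMPLETE = 0x8004;

        private IntPtr WndProc(IntPtr hwnd, int msg, IntPtr wParam, IntPtr lParam, ref bool handled)
        {
            if (msg == WM_DEVICECHANGE)
            {
                int eventType = wParam.ToInt32();
                if (eventType == DBT_DEVICEARRIVAL || eventType == DBT_DEVICEREMOVECOMPLETE)
                {
                    // a drive was added or removed -> re-query Win32_LogicalDisk here
                }
            }
            return IntPtr.Zero;
        }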

- when you disconnect a stick and then query WMI for Win32_DiskDrive / Win32_LogicalDisk data, you can get an exception ... sleep and retry works after about 5 seconds.

- you can check the result in command line using "wmic logicaldisk get"

kr,
Daniel

Wednesday, January 18, 2017

C# code generation for databases

Hi,

today I found SQLMetal.exe... https://msdn.microsoft.com/en-us/library/bb386987(v=vs.110).aspx

It is part of the Visual Studio installation and can create C# and VB code for different kinds of DB artefacts... I am curious how it compares to Entity Framework (= T4).

kr,
Daniel

Tuesday, January 3, 2017

vim as a command line util

For quick find, edit, search and/or replace actions vi | vim | gvim is a perfect tool. A research question for me was: is it also a tool for automation? I wanted to search and replace strings with vi-syntax through a batch file (yeah... using windows makes the thing even more special) which can be scheduled or called on demand.

... and yes... it works. see: http://stackoverflow.com/questions/6945558/calling-search-and-replace-functions-for-vim-from-within-a-bash-script ... the trick is to start in "Ex-Mode" ... (see: documentation), but unfortunately it did not work for me... I was not able to make it work in my script... no idea why... but:


Then I found the -c option, which is simple and works perfectly... -c executes a command (like a substitution) and can be used in a chain of "-c"s.



my test-environment:
echo. >> data.txt
del data.txt
echo data data data > data.txt
cls

type data.txt
gvim -c %%s/data/0101/g -c wq data.txt
echo _____________________________________________________
type data.txt

it outputs data before the substitution and 0101 after it... perfect!

kr,
Daniel