Tuesday, January 01, 2008

Truncating your transaction log.

First, HAPPY NEW YEAR GUYS!!

Now, some actual blogging...

Here I was, trying to restore our metadata-only database onto a production server, constantly getting an error message saying that it was way too big for the available space. Well, my concern was that the entire backup of the database was only 6MB, and the database itself only contains tables with very basic data.

What is worse, the space available on that production drive was 14GB, so gosh, it simply didn't make any sense. Well, after reviewing the original metadata-only database I realized that our constant adding, updating and deleting of structures had made our transaction log simply huge (40GB).

So, after some reading I found this nice recipe to trim the fat off your transaction log:

First, back up your existing log file by running
BACKUP LOG <database_name> TO DISK = '<backup_file_path>'

Now, shrink the transaction log by executing this
DBCC SHRINKFILE (<log_file_name>, <target_size_in_MB>) WITH NO_INFOMSGS

File name is the logical name of the transaction log file, and target size is the size (in MB) you want it to end up at. Don't be too demanding on the shrinking, but as a hint, I ended up making mine only 1MB because we don't need a log on our metadata db. :)
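For example, with a hypothetical database called MetaDB whose transaction log has the logical file name MetaDB_Log, the whole recipe looks like this (names and paths are made up, use your own):

BACKUP LOG MetaDB TO DISK = 'D:\Backups\MetaDB_Log.bak'
DBCC SHRINKFILE (MetaDB_Log, 1) WITH NO_INFOMSGS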


Update:
My apologies, the database in question is MSSQL 2005. Thanks for the feedback.

Thursday, November 15, 2007

Help Update 1 for Delphi 2007 available.

Another great sign of constant improvement of my favorite IDE.

If you have not gotten an information dialog from your installation, just go to Programs\CodeGear RAD Studio\Check for Updates.

You can go here to read the install and release notes, straight from CodeGear.

Tuesday, October 23, 2007

Migrating from ADO to dbExpress...

I finally started.

After holding off the migration for a long time, I am finally doing it. The reason? Well, CodeGear has renewed its effort on dbExpress, and its latest version, dbExpress 4, is a great example of it: good performance, great connectivity from Win32 and .Net, support for Blackfish SQL, source code included and completely rewritten in Delphi, and lots of improvements in D2007 to work with it, like a new SQL Query Editor.

My initial tests always showed better performance with dbExpress than with ADO. Now add the fact that ADO is being phased out by MS in favor of ADO.Net, and the choice was a lot easier.

But what did I do to migrate thousands and thousands of components? Well, it wasn't easy; there was no way I could do it in the form designer, it would simply take years.

So just go, open your Data Module, switch to code view, and Find/Replace the following (a quick before/after sketch follows the list):

1. Replace Connection with SQLConnection.
2. Replace TADOQuery with TSQLQuery.
3. Replace TADOStoredProc with TSQLStoredProc.
4. Replace TADODataSet with TSQLDataSet.
5. Replace Parameters with Params.
6. Replace cmdStoredProc with ctStoredProc.
7. Replace Direction with ParamType.
8. Replace pdOutput with ptOutput.
9. Replace pdInput with ptInput.
10. Under every dataset that uses SQLConnection, add Schema = "dbo" (or whatever your schema is).
11. Remove every parameter item called "@RETURN_VALUE".
12. Replace TDateTimeField with TSQLTimeStampField.
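To give an idea of what those replacements end up looking like, here is a minimal runtime sketch; the data module class, component names and query are hypothetical, and it assumes an ADOConnection1 / SQLConnection1 already sitting on the module.

// Before, with ADO (unit ADODB):
procedure TDM.OpenCustomersADO;
var
  Qry: TADOQuery;
begin
  Qry := TADOQuery.Create(Self);
  Qry.Connection := ADOConnection1;
  Qry.SQL.Text := 'SELECT * FROM Customers WHERE Id = :Id';
  Qry.Parameters.ParamByName('Id').Value := 42;
  Qry.Open;
end;

// After, with dbExpress (unit SqlExpr):
procedure TDM.OpenCustomersDBX;
var
  Qry: TSQLQuery;
begin
  Qry := TSQLQuery.Create(Self);
  Qry.SQLConnection := SQLConnection1;
  Qry.SQL.Text := 'SELECT * FROM Customers WHERE Id = :Id';
  Qry.Params.ParamByName('Id').AsInteger := 42;
  Qry.Open;
end;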


Geesh, that's what I remember. I will add more to the list if I hit more rocks on the road.

Thursday, September 27, 2007

Consolas fonts.

Well, I was reading CodeGear's non-technical newsgroups and found an interesting conversation about what kind of font people use in their development IDE; among the recommendations was the new Consolas font from Microsoft. Well, I surrendered to temptation, downloaded the font, set it up in my RAD Studio 2007 and expected to see something cool in front of me, but... nothing happened. I tried several times and nothing happened; the IDE didn't recognize the font, nothing changed.

Luckily I did my googling and found this nice tool, which saved me the trouble of trying to configure the font from the command prompt.

Well, I did it, restarted my sexy IDE and yahoooo!! I have a nice sexy font. I must admit I like it; it will take a while to feel fully comfortable with it, but I think it is here to stay.

Try it! You may like it too.

Monday, August 13, 2007

Restarting your application.

We are creating a project to automatically update our different modules. That usually implies an application restart, and lots of ideas went around: a service that does it, a batch file, or another program working as a proxy for the original one. But hey, I found this G R E A T article from Zarko Gajic, who is always giving amazing Delphi tips, on how to do it in a VERY easy way.
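The usual trick is simply to launch a fresh instance of the executable and then shut down the current one. A minimal sketch of that idea (not necessarily the article's exact code):

uses Windows, ShellAPI, Forms;

procedure RestartApplication;
begin
  // Start a brand new instance of this same executable...
  ShellExecute(0, 'open', PChar(Application.ExeName), nil, nil, SW_SHOWNORMAL);
  // ...and then shut down the one that is currently running.
  Application.Terminate;
end;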


Enjoy!

Saturday, August 04, 2007

The overload directive.

Well, it is fairly common to use Delphi's overload directive to create a method with different sets of parameters in an object.

What I wasn't aware of is that it actually works on functions and procedures that are not part of an object. That is cool.
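A quick sketch of two standalone routines sharing the same name (the names are made up, of course):

function Half(Value: Integer): Integer; overload;
begin
  Result := Value div 2;
end;

function Half(Value: Double): Double; overload;
begin
  Result := Value / 2;
end;

// Both calls compile fine: Half(10) returns 5, Half(10.0) returns 5.0.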

Thursday, August 02, 2007

RemObjects and DataSnap.

I'm using a mix of RemObjects SDK and Delphi's DataSnap on our middle tier, and we have implemented session management on all the services and in all the DataSnap remote modules.

Thanks to RO, that can be easily done on the DataSnap modules by turning on the RequiresSession property of the RODataSnapModule. Now the only problem with this is at design time: once it is on, you will not be allowed to connect to your DataSnap dataset providers without first logging in. So, as you can read, that is a problem.

The solution? We added a command line parameter to our application server that simply controls whether or not we want to require a session in our services or data modules. Once that flag is set, we just use it as an indication while we are creating the service/datamodule.
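Roughly, the idea looks like the sketch below; the 'nosession' switch name and the module class are hypothetical, and FindCmdLineSwitch comes from SysUtils:

procedure TMyDataSnapModule.DataModuleCreate(Sender: TObject);
begin
  // Require a session in normal runs, but let a command line switch turn it
  // off so we can still connect to the dataset providers at design/debug time.
  RequiresSession := not FindCmdLineSwitch('nosession');
end;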

Soon, I will post the results of a LETHAL combination: kbmMW, RO and DataSnap in a single SUPER POWERFUL MIDDLE TIER MILKSHAKE.

Tuesday, July 31, 2007

Developer Day en Español!!

Hi guys,

CodeGear is hosting "Developer Days en Español", a nice gathering of great developers with the Spanish-speaking Delphi community.

There is a wide variety of sessions, including Delphi, InterBase, PHP and Java.

It is only for 2 days; today was the first session and it was great. An average of 300+ guys were listening online to the different sessions, in my opinion a great success.

So if you want to check it out, this is the website.

Enjoy!

Monday, July 30, 2007

A nice surprise in Delphi 2007...

I was creating a "Lost Connection" dialog box for our rich client applications, and as a distraction I was browsing through thousands of nice icons on the internet. When I finally selected the icon that I wanted, I realized that it was (as almost every nice icon nowadays) a .png file.

Big disappointment, I said, because my mighty Delphi doesn't support that natively; I would need to use an open source component (actually there are tons of them, but I don't have the time for that), so I would have to convert it to that ugly bitmap thing. Oh well, I did, and when I was getting ready to load the image, sweet surprise!

TImage now supports gif, cur, pcx, ani, png, jpg, jpeg, bmp, ico, emf and wmf. I don't know if that is something new, but for sure it wasn't in Delphi 7 :P.
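For the record, loading one of those at runtime is a one-liner; Image1 and the file name below are hypothetical:

// TPicture picks the right graphic class based on the file extension.
Image1.Picture.LoadFromFile('lost_connection.png');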


Love my D2007.

Sunday, July 22, 2007

Visual Studio launch in 2008....

The latest news says that Visual Studio 2008 will see the light in a joint launch with MSSQL Server 2008 and Windows Server 2008 in Los Angeles on February 27th, 2008.

But it will be released officially by the end of this year.

CodeGear Studio will be out around September.

Sunday, July 08, 2007

Database scalability with MSSQL 2005

Some news from the battlefront...

We are expecting at least a 75% increase in our internet traffic, db transactions, and network and database activity in general, so our entire company is working hard on getting ready to manage this incoming wave.

Regrettably, life is not always good, and the core of the business is set up around a system using a typical client/server architecture where each workstation creates a connection to the database and, to make it worse, establishes server side cursors to fetch all the data it needs.

Now, on the other hand, the website is driven by old school ASP pages, using ad hoc queries or direct stored procedure calls to the already beat-up database. Finally, it is important to mention that this is all running over ODBC connections.

End result? Chaos. In high traffic moments the entire network slows down and all the applications begin a slow, silent death walk toward a total IT crisis. There are no caching techniques, no business layer, basically no middle tier... the opposite of everything I'm used to dealing with, and that is why it is my mission to turn this around and take over the world. (moooooahahaha)

Well, the mission is simple: we can't change much in a short time, so we need to make this system work as well as possible. So, my first task is to provide a list of recommendations for the deployment of our new database servers running MSSQL 2005.

Yup, we are migrating from MSSQL 2000 to MSSQL 2005. People may think that it is simply a straight import/export, but special considerations need to be made when dealing with the database collation, which has completely changed from one version to the other, and with the famous "compatibility modes". Keeping an imported database in compatibility mode 80 (MSSQL 2000) may save you headaches, but it will offer you little in terms of all the good new toys and performance tricks that you can apply in compatibility mode 90 (MSSQL 2005).
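For reference, you can check and change the compatibility level with sp_dbcmptlevel; the database name below is hypothetical:

EXEC sp_dbcmptlevel 'MyDatabase'        -- reports the current compatibility level
EXEC sp_dbcmptlevel 'MyDatabase', 90    -- switches the database to MSSQL 2005 behavior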

Below is a summary of my initial recommendations, aimed at the physical database storage design aspects you should consider when dealing with a situation like the one described above:

  1. Server side cursors imply lots of memory usage on the database server. So, increase the memory on the database server; the more the better. Make sure your database is actually using it by checking the database memory limit (pay special attention to the 3GB limit on the non-Enterprise versions of MSSQL).
  2. Multiple core machine? Well, you will not use them if you don't split that big nasty .mdf file into multiple .ndf files. What is the recipe? The number of data files within a single filegroup should equal the number of CPU cores.
  3. For optimized I/O parallelism, use a 64 KB or 256 KB stripe size when defining the RAID configuration.
  4. Use manual file growth database options. Automatic is only for development (ha! you didn't know that, did ya?).
  5. Increase the size of "tempdb" and monitor space usage, adjusting accordingly. The recommended level of available space is 20%. All your temp tables and indexes are created there, so keep that guy with enough space; if you are using the default size, that is only 8MB, and that is basically 8 floppies, so be nice and put some more space there. (See the sketch right after this list.)
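As an example of items 4 and 5, this sketch pre-sizes tempdb and uses a fixed growth increment instead of relying on the small defaults; the sizes are hypothetical and should be adjusted to your own workload:

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 2048MB, FILEGROWTH = 256MB)
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, SIZE = 512MB, FILEGROWTH = 128MB)
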
Alright, so that should buy us some time to concentrate on the next stage, database optimizations. Until the next post.
