7:41 PM - Overwatch firewall config
I managed to get it running a little better by tweaking the pfSense configuration with the list of ports
https://us.battle.net/support/en/article/300479
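If you want to sanity-check whether the firewall is actually letting those connections out, here's a rough sketch in Java that just attempts outbound TCP connections on a handful of ports. The host and port numbers below are placeholders, not Blizzard's actual list (and UDP ranges can't be tested this way); the real list is in the Battle.net article above.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.net.Socket;

    // Rough sketch: try outbound TCP connections to see if a firewall rule is
    // blocking them. Host and ports are placeholders; use the list from the
    // Battle.net support article linked above.
    public class PortCheck {
        public static void main(String[] args) {
            String host = "us.battle.net"; // placeholder test host
            int[] ports = { 80, 443, 1119 }; // placeholder ports
            for (int port : ports) {
                try (Socket socket = new Socket()) {
                    socket.connect(new InetSocketAddress(host, port), 3000);
                    System.out.println("TCP " + port + ": reachable");
                } catch (IOException e) {
                    System.out.println("TCP " + port + ": blocked or unreachable (" + e.getMessage() + ")");
                }
            }
        }
    }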
location: Home
I've had a lot of lag problems with Overwatch. It's quite often that I will think I killed someone only to have the game's lag algorithm "correct" their movement in such a way that I don't even have the person in the same room with me.
I've been trying to figure out if there is anything about my network setup making it worse. I suspect others share in my frustration with the game. It's quite fun, but I'm not sure it's worth the headache.
location: Home
Just released a new version of my blogging site. It's the first serious update in a few years. I'm still migrating it to Spring Boot, but so far it's going pretty well.
I've finally got it running without the need for root, as a self-contained jar file, and moved to Thymeleaf for templates.
It was the project I learned Java on back in 2003. It's moved from CVS to Git and from SourceForge to GitHub. It's transitioned from Servlets and JSP to Maverick with XSLT templates, and now to Spring 4 / Spring Boot.
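In case anyone is curious what the Spring Boot side looks like, here's a minimal sketch of a self-contained app with a Thymeleaf view. The class and template names are made up for illustration; this isn't the actual Just Journal code.

    package com.example.blog;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.RequestMapping;

    // Runs with an embedded servlet container: "java -jar blog.jar", no root
    // required as long as it binds to an unprivileged port like 8080.
    @SpringBootApplication
    public class BlogApplication {
        public static void main(String[] args) {
            SpringApplication.run(BlogApplication.class, args);
        }
    }

    @Controller
    class EntryController {
        @RequestMapping("/")
        public String index(Model model) {
            model.addAttribute("title", "Recent entries");
            // Resolves to the Thymeleaf template src/main/resources/templates/entries.html
            return "entries";
        }
    }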
location: Home
I've been working on migrating foolishgames.com over to PostgreSQL from MySQL. I've been using MySQL since I worked at USOL.com around 2000. While I still appreciate how simple MySQL is to administer and use, several things have bugged me about its direction since Oracle bought it.
First, the shared library requires threads in C now. This means that half the time I can't get the Perl bindings to work and it's often a pain with my C programs too.
I feel that they're only taking it far enough that it doesn't compete with Oracle. PostgreSQL, on the other hand, has been adding all sorts of neat features like better replication, JSON data types, and a rewrite of how the engine allocates memory. There's real movement in that camp.
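The JSON support is a good example. Here's a rough sketch of what a jsonb column looks like from Java over JDBC; the database name, credentials, and the entries table are just examples, and you'd need the PostgreSQL JDBC driver on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    // Sketch: store and query JSON in a PostgreSQL jsonb column.
    // Assumes an example table like: CREATE TABLE entries (id serial primary key, body jsonb);
    public class JsonbExample {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:postgresql://localhost:5432/foolishgames"; // example URL
            try (Connection conn = DriverManager.getConnection(url, "user", "secret")) {
                try (PreparedStatement insert = conn.prepareStatement(
                        "INSERT INTO entries (body) VALUES (?::jsonb)")) {
                    insert.setString(1, "{\"title\": \"Hello\", \"tags\": [\"pgsql\", \"migration\"]}");
                    insert.executeUpdate();
                }
                // ->> extracts a JSON field as text
                try (PreparedStatement query = conn.prepareStatement(
                         "SELECT body ->> 'title' AS title FROM entries");
                     ResultSet rs = query.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString("title"));
                    }
                }
            }
        }
    }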
Working on Just Journal 2.0
Trying to build Qt 5 for the hell of it. It's pretty crazy how their build system works. I think it thinks it's running on Linux. Ugh.
Decided to try out the new .NET 4.5 JJ client. Still needs some work.
location: Home
So far I'm loving Windows 8. My first impressions of the Consumer Preview were not that high, but it's surprisingly polished for a new Windows release.
location: Home
mood: Happy
The new MySQL blows.
Caryn's going on a business trip on Sunday. I'm trying to get some paperwork straightened out with my employer. Then there's the painters and getting the house ready for the next set of rooms. Tigress is doing better, but I'll be feeding her solo next week. So much going on.
Many people remember Steve Jobs as a visionary, the driving force behind Apple, Inc.'s success in the last decade. He also owned Pixar, transforming a small animation studio into a blockbuster success before selling it to Disney. He sat on the boards of Apple, Inc. and Disney. He had successes and failures. NeXT Computer was a computer manufacturer that made workstations for schools and businesses in the early 90s. They made some of the first computers with decent graphical user interfaces, networking, and Mach kernels. Steve sold NeXT Software (the hardware business had failed) to Apple around 1997 and became the head of a company he founded once more.
The world wide web was created on a NeXT computer. The first website, web browser, and web server all ran on a NeXT cube! Steve brought us the iMac, Mac OS X, iPad, iPhone, iTunes (well, they bought this from a former Apple employee), and the reinvention of how users consume content. Good or bad, this has affected all of us.
Steve didn't do these things alone. Many other talented people helped him. He sold the ideas to all of us.
I started my BSD project because of Steve Jobs. NeXT (and OS X) was an idea that computers could be powerful, stable, and easy to use. I've spent the last six years of my life trying to build something like OS X, but for people who couldn't afford the Apple premium. As I've learned, he had to charge that much to be successful.
Just finished mounting the other hard drive in ds9. Next time I build a rack mount server, I need to remember to buy hot swap bays. This project took 5 hours and it should have been quick.
I always hear that one of the advantages of Linux over BSD is the hardware support. My new laptop has proven to be a problem on that front. Ubuntu installed from Windows worked somewhat OK, so I tried to install it via a burned ISO. It randomly crashed during install and never would set up GRUB for booting. I had deleted Windows, but installed BSD at the beginning of the disk.
After trying six times to install Ubuntu, I decided to try Debian. Unlike Ubuntu, Debian has an older kernel (2.6.32). This is older than the magic 2.6.38 where AMD added graphics support. I thought I'd be clever and go to sid, which has a 3.0 kernel. Often, the screen goes totally black during boot. No virtual terminals work and gdm3 won't start up either. I can't even get into single user mode without black screens after a few seconds of booting.
2.6.32-5 will boot and work, but I don't get battery or CPU frequency information, and the AMD graphics driver does not work well with it. It runs, but without much acceleration. Not knowing how much battery life is left on a laptop is a big problem. It runs like it's on AC!
I also can't dim the display, because I have to use a hack to work around the broken ACPI video so the backlight won't be dimmed, and the keys to change brightness don't work either.
Sound also doesn't work in 2.6.32. It did in Ubuntu.
Conversely, MidnightBSD does not have working wireless, and I have not tried sound. The onboard Atheros NIC doesn't work well in either OS, but it's more stable on BSD. I have to run 0.4-CURRENT for that. I don't have the dimming problem with the backlight unless I load the acpi_video module on MidnightBSD, and that's not loaded by default. There is no binary AMD graphics driver, so I can't go with that forever.
I sent AMD an email today asking about Linux drivers for the new laptop that I ordered. I was told in this email that AMD does not support laptops with ANY drivers and that driver support is up to the OEM. Further, they said they don't support Linux even though they offer Linux binary drivers.
How does that work?
I just purchased a laptop last evening. It's been harder this time than at any point since I got my first computer. There are so many new technologies out that there's a lot to sift through. Consider that in the old days, you could look at a few specs and know one computer was better than another. Sure, there were quality differences and software packages, but the hardware was easy to figure out.
Between the Intel 486 and the Pentium 4, one could look at the frequency (MHz or GHz) number for a rough idea that one processor was faster than another. Then around 2006, they started shipping multicore CPUs. That makes things a lot more complicated. Most people didn't know what a core was. Computer geeks even thought about SMP (symmetric multi-processing), or multiple processors in a computer, not cores. Without getting too crazy, a core is like a brain in the processor. A multicore CPU means it's got more than one brain. The computer can think about multiple problems at the same time. It can do two different tasks at once, like play a game and record a movie.
To make matters more confusing, a multicore CPU doesn't mean it's twice as fast as a single-core CPU (the old ones). Take two nearly identical CPUs, one with 2 cores and one with 1: the 2-core chip won't be twice as fast. There's a math formula to figure out the actual best-case performance, but I'll spare you that. Worse yet, if you don't run two programs at once, or you don't use a multithreaded program (one program that can do more than one thing at once), you don't get a lot of use out of a multicore CPU. Windows, Mac OS, and Linux can use them for their own work.
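OK, for the truly curious, the formula in question is Amdahl's law: if only a fraction p of a program's work can run in parallel, the best-case speedup on N cores is 1 / ((1 - p) + p / N). Here's a quick sketch; the 50% and 90% parallel fractions are just made-up example values.

    // Amdahl's law: best-case speedup on n cores when only a fraction p of the
    // work can run in parallel is 1 / ((1 - p) + p / n).
    public class Amdahl {
        static double speedup(double p, int cores) {
            return 1.0 / ((1.0 - p) + p / cores);
        }

        public static void main(String[] args) {
            double[] parallelFractions = { 0.5, 0.9 }; // example values only
            int[] coreCounts = { 1, 2, 4 };
            for (double p : parallelFractions) {
                for (int n : coreCounts) {
                    System.out.printf("%.0f%% parallel, %d core(s): %.2fx%n",
                            p * 100, n, speedup(p, n));
                }
            }
            // Even a program that is 90% parallel only gets about 1.82x from 2 cores.
        }
    }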
Some consumers figured out, sort of, what a multicore CPU was. Intel ran all those fun ads about multiplicity and whatnot. Then they made major improvements in chip performance, yet with lower frequency (MHz again). So the Core 2 Duo CPU (a confusing name, because the 2 is not the number of cores but the generation) seemed slower by the numbers, but it was faster than the Pentium D it replaced (multicore stuff).
So consumers couldn't trust the numbers anymore. How to tell what is faster? Intel had this great idea to give the chips model numbers. Within the same range, a higher number means a faster chip. That led to other problems: across ranges, a 350 might be faster than a 610. That's not intuitive. To make matters worse, Intel would sell chips to computer companies with some features missing.
As a consumer, I have to search Intel's website to find out if all the features are there when looking for a computer. Many of them have weird names like hyperthreading or VT or execute disable bit. Do most people even need these things? Maybe. Hyperthreading is a hack Intel came up with to trick a processor into thinking it's got 2 brains (cores) when it has 1. This means two programs can run at the same time, though each runs more slowly than it would running alone. VT is for virtualization. If you buy a high-end version of Windows and want to use the XP compatibility mode, you need this. Otherwise, it's only good for IT people. Finally, execute disable bit is always a yes. It's a security feature that stops some viruses and other bad programs from working.
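One side effect: the operating system counts those fake brains as real ones, so anything that reports "CPUs" shows the logical count. A trivial Java sketch, just to show where that number comes from:

    // The OS reports logical processors, so a dual-core CPU with hyperthreading
    // typically shows up as 4 here even though there are only 2 physical cores.
    public class CoreCount {
        public static void main(String[] args) {
            System.out.println("Logical processors: "
                    + Runtime.getRuntime().availableProcessors());
        }
    }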
So now the computer industry has found a way to make things even more complicated. There is new technology where they combine a graphics card (what makes the picture on the screen) and a processor together. This is a great thing for people who don't play games. It means your laptop will have better battery life. The graphics power in these things is very low compared to discrete graphics (separate video cards), so they're terrible for WoW, StarCraft 2, or Portal 2. They can run these games, but not fast. The other problem with these integrated chips is that they are usually slower than chips without this feature (especially on the AMD side). AMD has decided that graphics power is more important than CPU power, because many people just watch movies or whatever and don't need CPU power. Intel did the opposite and made the graphics just barely enough to watch the latest high-def video, but with fast CPUs. Intel calls their CPUs with this feature Sandy Bridge (the codename of the chip/core) and AMD calls theirs Fusion, with three series: A, E, and C (fastest to slowest).
So when buying a new computer, realize that every small laptop under 14 inches probably has one of these new chips in there. It won't be much faster than a two-year-old computer in CPU power. If you buy a 15-17 inch laptop and it's Intel, you will probably get a Core i3 or Core i5 CPU with this feature now. If it's an AMD, you may get it (A series) or a Phenom II CPU without it. The chips with it might be as slow as 1 GHz. The type of chip matters too: i3 <= i5 <= i7 for Intel, and C < E < A for AMD (no overlap).
The other big thing to look out for is solid state drives (SSDs). This is a replacement for hard drives, which is where your data is stored (Windows, your files, games, etc.). Hard drives use magnets and spinning disks to store information. There are moving parts. You have to wait for the little read head to get back to where it needs to be (sort of like a cassette tape, but faster) before it reads data. An SSD is faster for reading information (usually) because it can go directly to the place something is stored rather than waiting on moving parts. It's also said to be more reliable because there are no moving parts. However, I've seen several go bad, so ignore those claims. Solder can fail, and each spot can only be written a fixed number of times. They do wear out. They are expensive and smaller than hard drives. If you don't need speed and you have a lot of movies, music, games, etc., stick with hard drives. Eventually SSDs will be better, but it's still fairly new technology.
dbaspot.com/ingres-database/191160-there-equivavlent-auto_increment-ingres.html
This is a mailing list conversation between several users of the Ingres database about why auto increment primary keys are bad. I disagree with them for many reasons.
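For anyone who hasn't run into the term: an auto-increment (serial/identity) key just means the database hands out the next id when you insert a row. From Java over JDBC it looks roughly like this sketch; the posts table and connection details are made-up examples, not anything from that thread.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch: insert a row and read back the database-generated primary key.
    // Assumes an example table such as (MySQL syntax):
    //   CREATE TABLE posts (id INT AUTO_INCREMENT PRIMARY KEY, title VARCHAR(255));
    public class AutoIncrementExample {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:mysql://localhost:3306/blog"; // example URL
            try (Connection conn = DriverManager.getConnection(url, "user", "secret");
                 PreparedStatement ps = conn.prepareStatement(
                         "INSERT INTO posts (title) VALUES (?)",
                         Statement.RETURN_GENERATED_KEYS)) {
                ps.setString(1, "Hello world");
                ps.executeUpdate();
                try (ResultSet keys = ps.getGeneratedKeys()) {
                    if (keys.next()) {
                        System.out.println("Generated id: " + keys.getLong(1));
                    }
                }
            }
        }
    }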
Just finished watching Detonator. I haven't seen it in years. Patrick Stewart plays a U.N. team lead who is tasked with stopping two bombs from detonating. It's a rather funny film. Ted Levine is also in it, but he looks quite a bit younger compared to Monk. Of course, it's ten years before Monk too.
Caryn decided to take a nap rather than watch the film. LOL.
Crazy. I had a 770GB log file on a server today. It actually filled up the file system. Very messy.
Caryn and I went to the U2 concert yesterday at MSU. It was a great show, but it's obvious they're getting tired of playing the songs from the latest album. They just flew back from a music festival overseas and I suspect they were tired. Several of the songs were played rather quickly.
They still sounded awesome though.
The concert ended around 11PM, but it took us until 2AM to get home. Rough night. We also had some weird issues with the venue. They didn't allow purses, so we had to walk clear to the other side of the campus to take Caryn's purse back to the car. I was very tired just from walking. Not used to pesky exercise.
I just got the RAM for the servers. Had a few glitches. ds9 was a breeze. I just popped it in and turned it back on.. 12GB of RAM. Stargazer, on the other hand, won't boot with the original HP memory chip plus the new Kingston RAM. I had to live with 8GB of RAM in that one :)
Caryn's new RAM came too. We're just too tired to pop it in right now.