Hi everyone! I haven’t written a post on my blog in a few months, and the first post I decided to write in a long time involves my thoughts on politics and Christianity. Run for the hills!!! No… wait. Just kidding! In all honesty, I have wanted to write about this since …
I am happy to announce that my secure online digital store, Valor Apps, is now up and running!
I will be using my new site to sell all of my commercial Joomla extensions! The site is run under my business name, Valor Apps, and you’ll find more information there on all of my software. I will be updating the site over the next month or two, adding more documentation for your favorite software.
This blog site will still be used for my personal affairs, and also to promote my software in a general sense. My ultimate aim is to separate my software business from my personal blog. One of the major benefits is that I will no longer be sending download emails; instead, you can log into the Valor Apps site and download any software you purchase.
Thanks for your continued support, everyone!
I am just sitting here at my desk, looking through some Korean web toons at http://comic.naver.com. I actually like a lot of these web toons. There are fans of these web toons who translate them into English for others to read and enjoy. I read a number of them faithfully each week. I just wish I could read Korean myself. …
What if you wanted to call an unknown method that belonged to an unknown object? Or even multiple unknown methods from multiple unknown objects?
Huh? What am I talking about? You don’t think you would want to do that? Well, guess what? I wanted to do that just this past week! Yep! I wanted to call an unknown method (or methods) of an unknown object (or bunch of unrelated objects).
What’s that? Why would I want to do that? Well, have you ever used a UIControl, such as a UIButton, and wanted its click event to trigger a particular method on one of your project’s objects? How would you implement that? Well, it’s quite simple; you just do the following: …
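The excerpt is truncated above, but the standard UIKit target-action pattern it is describing looks roughly like this (a sketch; the button variable and method names here are my own illustrations, not necessarily the ones from the full post):

[sourcecode language="objc"]
// Wire the button's touch event to a method on some object.
// The control stores only a target object and a selector; it does not
// know the class of the target or the signature of the method.
[myButton addTarget:self
             action:@selector(buttonClicked:)
   forControlEvents:UIControlEventTouchUpInside];

// Later, when the event fires, the control invokes the "unknown" method
// on the "unknown" object, conceptually like this:
if ([target respondsToSelector:action]) {
    [target performSelector:action withObject:sender];
}
[/sourcecode]

This is exactly the “unknown method on an unknown object” idea: any object and any selector can be stored and invoked later without the caller knowing either at compile time.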
(I started to write this on Father’s Day. However, I only finished it on the following Saturday, the 25th.)
Happy Father’s Day to all you fathers out there!
I know the day is almost done, but this is the first time today that I have been able to be on my blog! My daughter, wife, parents, and other relatives and friends wished me a happy Father’s Day today! It was a good day… a busy day… but a good day!
Today I gave a short sermon at church for Father’s Day, and I wanted to share a summary of it with you all. I hope it encourages all you fathers out there to strive to be the best dads you can be. My message was based on an acrostic of the word FATHER that I devised: …
Introduction
This is my first tutorial, and I am proud to make it about writing Objective-C code for iOS and Cocos2D. Since the debut of the iPad, there has been much talk on the Cocos2D forums about how to code one universal app that will work on the iPhone/iPod touch (resolution: 320×480), the iPhone 4 Retina display (resolution: 640×960), and the iPad (resolution: 1024×768). As it stands, Cocos2D 0.99.5 to 1.0-rc accommodates the iPhone and Retina display resolutions automatically via a ‘-hd’ filename suffix. So, if you have a background file named background.png that works at 320×480, all you have to do is create a larger version of the file named background-hd.png. See here. The second way Cocos2D accommodates the iPhone and Retina display is in positioning: since version 0.99.5, Cocos2D automatically uses points instead of pixels. So, setting a position to ccp(100, 100) on the normal display is automatically translated to the pixel position (200, 200) on the Retina display. The issue that remains is how to specify the correct images and positioning on the iPad. In my research on how to do this, I came across the following discussions on the Cocos2D forums:
- iPad2 + Retina + Universal
- one app for all (iPhone & iPad), iPad image issue
- How can iPad game use -hd images ?
- Use Retina Graphics In iPad
- Running iPhone game on iPad with hd images ? Ok with the guidelines ?
- HD Graphics in iPad x2 Mode
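As a quick illustration of the built-in ‘-hd’ handling and point-based positioning described above (a sketch; the filename is the example from the introduction):

[sourcecode language="objc"]
// With Retina display support enabled, Cocos2D resolves filenames itself:
// on a 320x480 device this loads background.png; on the Retina display it
// automatically looks for background-hd.png, with no change to this code.
CCSprite *background = [CCSprite spriteWithFile:@"background.png"];

// Positions are in points, not pixels: ccp(100, 100) lands at pixel
// (200, 200) on the Retina display automatically.
background.position = ccp(100, 100);
[self addChild:background];
[/sourcecode]

Note that none of this built-in handling covers the iPad, which is what the rest of this tutorial addresses.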
Code Setup
First, I created a header file to hold my constants, such as file names; I called it Constants.h. I also created another header file that contains macros to detect and handle the different device details; I called this file DeviceSettings.h.

[sourcecode language="objc" title="DeviceSettings.h"]
#import <UIKit/UIDevice.h>

/* DETERMINE THE DEVICE USED */
#ifdef UI_USER_INTERFACE_IDIOM
#define IS_IPAD() (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
#else
#define IS_IPAD() (NO)
#endif

/* NORMAL DETAILS */
#define kScreenHeight 480
#define kScreenWidth 320

/* OFFSETS TO ACCOMMODATE IPAD */
#define kXoffsetiPad 64
#define kYoffsetiPad 32

#define SD_PNG @".png"
#define HD_PNG @"-hd.png"

#define ADJUST_CCP(__p__) \
    (IS_IPAD() == YES ? \
    ccp( ( __p__.x * 2 ) + kXoffsetiPad, ( __p__.y * 2 ) + kYoffsetiPad ) : \
    __p__)

#define REVERSE_CCP(__p__) \
    (IS_IPAD() == YES ? \
    ccp( ( __p__.x - kXoffsetiPad ) / 2, ( __p__.y - kYoffsetiPad ) / 2 ) : \
    __p__)

#define ADJUST_XY(__x__, __y__) \
    (IS_IPAD() == YES ? \
    ccp( ( __x__ * 2 ) + kXoffsetiPad, ( __y__ * 2 ) + kYoffsetiPad ) : \
    ccp(__x__, __y__))

#define ADJUST_X(__x__) \
    (IS_IPAD() == YES ? \
    ( __x__ * 2 ) + kXoffsetiPad : \
    __x__)

#define ADJUST_Y(__y__) \
    (IS_IPAD() == YES ? \
    ( __y__ * 2 ) + kYoffsetiPad : \
    __y__)

#define HD_PIXELS(__pixels__) \
    (IS_IPAD() == YES ? \
    ( __pixels__ * 2 ) : \
    __pixels__)

#define HD_TEXT(__size__) \
    (IS_IPAD() == YES ? \
    ( __size__ * 1.5 ) : \
    __size__)

#define SD_OR_HD(__filename__) \
    (IS_IPAD() == YES ? \
    [__filename__ stringByReplacingOccurrencesOfString:SD_PNG withString:HD_PNG] : \
    __filename__)
[/sourcecode]

The idea is to have a set of macros that check for the iPad and substitute the correct filename and coordinates (points/pixels) on the screen.
Here is the constants file:

[sourcecode language="objc" title="Constants.h"]
/* TEXTURE FILES */
#define kSpriteTexture1 SD_OR_HD(@"GoodGuy.png")
#define kSpriteTexture2 SD_OR_HD(@"StageBoss.png")

/* FIXED POSITIONS */
#define kSomePosition ADJUST_CCP( ccp(200, 100) )
[/sourcecode]
Textures
So, from looking at the Constants.h file, you can see that we just need to wrap the texture filename in the SD_OR_HD() macro. On the iPhone/iPod touch, Cocos2D will load GoodGuy.png; on the iPhone 4 Retina display or the iPad, it will load GoodGuy-hd.png.

[sourcecode language="objc" title="SomeCodeFile.m"]
CCSprite *goodGuy = [CCSprite spriteWithTexture:
    [[CCTextureCache sharedTextureCache] addImage:kSpriteTexture1]];
goodGuy.position = kSomePosition;
[self addChild:goodGuy];

CCSprite *stageBoss = [CCSprite spriteWithTexture:
    [[CCTextureCache sharedTextureCache] addImage:kSpriteTexture2]];
[stageBoss setPosition: ccp(0, ADJUST_Y( kScreenHeight * 0.45 ))];
[self addChild:stageBoss];
[/sourcecode]

For the positioning of the goodGuy sprite, you will get the following results:
- iPhone/iPod touch: ccp(200, 100)
- iPhone4 Retina Display: ccp(400, 200)
- iPad: ccp(464, 232)
Labels
Now, in the DeviceSettings.h file, I also have HD_PIXELS and HD_TEXT. I use these to adjust coordinate changes and the size of text used in Cocos2D, such as labels and menu items. So, for example, to move a sprite down by 60 pixels on the iPhone and the equivalent 120 pixels on the iPad and Retina display, I would type:

[sourcecode language="objc" title="Example: using the macros in a class"]
id action = [CCMoveBy actionWithDuration:1.0f position:ccp(0, HD_PIXELS( -60.0f ))];
[goodGuy runAction:action];
[/sourcecode]

As for font sizes, you can use HD_PIXELS, which is the most accurate method, but I often find the resulting text too big, so I use HD_TEXT instead, which multiplies the font size by 1.5 instead of 2.0.
Spritesheets and BMFonts
This concept can also be applied to loading spritesheets and BMFonts. You can make spritesheets with TexturePacker, and you can make BMFonts with Glyph Designer. Basically, add the following to the DeviceSettings.h file:

[sourcecode language="objc" title="DeviceSettings.h"]
/* SD/HD font file */
#define SD_FNT @".fnt"
#define HD_FNT @"-hd.fnt"

/* SD/HD spritesheet plist */
#define SD_PLIST @".plist"
#define HD_PLIST @"-hd.plist"

#define SD_HD_FONT(__filename__) \
    (IS_IPAD() == YES ? \
    [__filename__ stringByReplacingOccurrencesOfString:SD_FNT withString:HD_FNT] : \
    __filename__)

#define SD_HD_PLIST(__filename__) \
    (IS_IPAD() == YES ? \
    [__filename__ stringByReplacingOccurrencesOfString:SD_PLIST withString:HD_PLIST] : \
    __filename__)
[/sourcecode]

Using these macros is simple. Here’s the code:

[sourcecode language="objc"]
/* BMFont example */
NSString *string1 = NSLocalizedString(@"This works!", @"");
CCLabelBMFont *notice1 = [CCLabelBMFont labelWithString:string1
                                                fntFile:SD_HD_FONT(@"myFontFile.fnt")];
notice1.position = ADJUST_XY( kScreenWidth * 0.7f, kScreenHeight * 0.4f );

/* Spritesheet example */
// 1. (Pre)load the spritesheet
[[CCSpriteFrameCache sharedSpriteFrameCache]
    addSpriteFramesWithFile:SD_HD_PLIST(@"mySpritesheet.plist")];
// 2. Get an image inside the spritesheet as a CCSprite
//    Note: mySpriteImage.png is listed in mySpritesheet.plist and packed in mySpritesheet.png
CCSprite *mySprite = [CCSprite spriteWithSpriteFrameName:@"mySpriteImage.png"];
[/sourcecode]

This ends the tutorial. I hope the code is self-explanatory.
- Compared to humans, the data that supercomputers have access to was all given to them; it was not learned through any form of sensory input. The data they have was probably provided by searching and indexing terabytes of factual data and storing it in a fault-tolerant database system. To me, that is like competing against a textbook. The human mind is multi-dimensional, in the sense that all the data and information we learn and store in our minds/brains comes from taking in every sound and every visual image, making sense of it, and then remembering that association in the brain. To me, that is not the same as searching through a database of organized information that was basically sorted and given to you, and specifically designed to work in a game called Jeopardy. I mean, any human who ever goes up against a supercomputer like Watson is processing every visual, auditory, and tactile input at the same time as trying to understand, process, and answer a Jeopardy question. The processing power of our brains is always divided. The brain is processing bodily functions (like controlling hormones) and input (like deciding to scratch an itch under your sleeve, or wishing the air conditioning weren’t so cold in the Jeopardy studio). The human mind may at any moment be wondering whether the stove was left on, or whether your wife is watching you on TV, and still focus on answering Jeopardy questions. Supercomputers have no idle thoughts and no physical functions to manage; all they do is process the data they were designed to process. For me, the day I will admit that a supercomputer is intelligent is the day it can learn and process data the same way I can, which involves visual and audible input from its surroundings.
- Secondly, what I find intelligent are the people who write and develop the custom algorithms that process the speech input and the data that Watson has stored. Did Watson create or develop any of those algorithms for itself? Did it figure out how to learn a language, or what words mean in different language constructs, by itself or intuitively? Someone, or rather teams of people, had to figure out how to write the instructions to make Watson do it. So, in actual fact, the intelligent ones are the people who created all these algorithms and figured out how to get them all to work together. Watson is the culmination of their intelligence. Personally, I think it doesn’t do them justice to call the supercomputer intelligent without recognizing the intelligence of these individuals. I mean, from the time we are born, it takes us years to fully understand the complexities and nuances of a language, and no one puts instructions into us as to how to do it. Starting from nothing but the ability to take in audible data, we learn to understand language and the meaning of the sounds we hear. As of yet, I don’t know of any supercomputer that can do that. For me, I was more interested in the people who made Watson possible. They are pretty darn awesome!