DPChallenge: A Digital Photography Contest
 

DPChallenge Forums >> Hardware and Software >> Photography Software: I Wrote Some
Showing posts 26 - 50 of 53
11/18/2010 08:26:53 AM · #26
By the way, the software looks nice. If I had an iPad and were shooting manually on my camera, I'd buy a copy :)
11/18/2010 08:53:35 AM · #27
I just bought the payware version. I'll let you know how it works out for me. I only shoot manually BTW.
11/18/2010 05:49:23 PM · #28
Originally posted by wiesener:

Originally posted by Mousie:

Originally posted by wiesener:

Some things are only possible (or at least much easier) to do on the Java-side, such as audio playback. Other things make more sense to do on the "native" (C/C++) side, such as manipulation of large arrays, tight loops with lots of calculation etc. Some things will even make sense to implement in Assembly, such as operations that can benefit from the NEON extensions on some of the ARM processors.


That's why I say that Android is in Java... you need to use the highest level language to get the full suite of functionality.

It's just like Obj-C in iOS... you can write a lot of code in raw C or even lower level languages, but to access some of their higher-level APIs you MUST use Obj-C... they only make them available in that syntax.

For example, all my mathematically intense, OpenGL ES-based particle system code in my iOS app PixelSwarm is written in pure C using simple structs, inline methods, and raw memory buffers... NOT objects that need to be constantly allocated and destroyed or that send messages to each other to interpret and execute. Because, you know, it needs to be snappy! The UI itself is all in Obj-C, however.

Even so, I'd consistently say that you develop for the iPhone in Obj-C, regardless of how I've personally optimized certain sections of my own logic.

And Assembler? Why not just say that's how you can program everything, and call it a day? Why bother calling it out? You're being a bit pedantic, here. ;)

P.S. You forgot Ruby and Python, both of which can be used to implement Android apps. (JRuby/Jython)

And web apps using JavaScript and DHTML techniques.

And Flash.

And you could write an online back-end that drives major parts (or all) of your app's behavior in any language supported by your server, using the device as a thin client.

See? I can be pedantic too. But if I were to send someone off to learn Android development, I'd tell them to go learn Java, not the rest. Which is what I did. :)


If I were to send someone off to learn Android development I'd tell them to start off with Java as well, but I would still let them know that there is more to the equation. By comparing iOS as being Objective-C and C with Android as being only Java, you are implying that Android is a less advanced platform than iOS, which obviously isn't true. I just wanted to correct that impression.

As for Assembly, I specifically call it out because you need it (or some similarly archaic compiler intrinsics) if you want to benefit from the CPU extensions available in the newest ARM processors with NEON (that goes for iOS as well, by the way). Think of it like the MMX/3DNow!/SSE2 extensions for desktop CPUs. Using those extensions you can do stuff like multiply and add on 16 variables simultaneously, which will obviously give a tremendous speed increase for software where those sorts of operations are relevant - video playback being a common example.

Of course you are right that there are a bunch of other higher-level languages that can be used as well, specifically anything that compiles to Java bytecode, and I guess I should have mentioned those for completeness. My goal, however, was simply to inform people that low-level (and performance-critical) development is actually possible on Android platforms as well. There seems to be a general misconception that Android is slow and sluggish ("cause it's Java-based, you know"), and I don't want people to perpetuate that myth. Is that pedantic of me? Maybe.


Ahh... I get it! You feel Java has some issues, and that when comparing it to Obj-C it's inferior somehow, or that it reflects poorly on Android development. I don't really see it that way. The days of Java being slow are long, long gone. I write enterprise storage administration software in Java (and Java converted to JavaScript via GWT (the horror!))... if anything I'd suggest that Java has significant advantages over Obj-C for the overwhelming majority of app development, which is mainly consumer-targeted GUIs in these mobile contexts.

The Obj-C syntax is insanely heavyweight, often requiring you to very precisely define things twice: in a header AND again in the source. Worse, iOS-flavored Obj-C does not even have true garbage collection! It's cute, powerful, and a pleasure to write in... if you can pay attention to detail... but Java is a much more robust production language, IMO. I'm not sure if you've used Obj-C, but how tiresome is it to always be typing something like:

[someObject aSignatureThatTakes: foo andAlsoTakes: bar andAlsoHasTheParam: cat andFinally: dog];

...instead of:

someObject.aSignatureThatTakes(foo, bar, cat, dog);

It's kind of absurd. :)

Java may have a bad reputation, but I feel it's undeserved at this point. I find that most of the time it feels just like C (with an object syntax) but with less manual maintenance and better libraries/toolkits. I think Java is the most important feature of Android, a really smart choice on their part, and is going to lead to its eventual adoption in many more places than you currently see iOS. Obj-C is a hurdle to adoption in a way that Java is not.

But that's just my take on things!

(Sorry to all the non-techies out there, we programmers tend to care about these details a lot, despite how boring they look in print!)
11/18/2010 05:50:40 PM · #29
Originally posted by TrollMan:

I just bought the payware version. I'll let you know how it works out for me. I only shoot manually BTW.


I'd love to hear your feedback. Unfortunately the new release does not ship until Apple drops iOS 4.2, but that should happen sometime this month, and you'll get the update for free of course. :)
11/18/2010 05:51:25 PM · #30
Originally posted by wiesener:

By the way, the software looks nice. If I had an iPad and were shooting manually on my camera, I'd buy a copy :)


Thanks! I put way too much work into something that basically slides pictures around... but that's how I roll! :)
11/18/2010 09:20:46 PM · #31
Just out of curiosity: how would you compare the accuracy and workflow of (reflected light) spot metering in aperture priority mode vs. incident light meters in manual mode?
11/18/2010 10:13:02 PM · #32
Wow.. That's a lot of text to read through.
Unfortunately I'm at work and don't really have much time to crunch through it all.
At first appearance (which means by looking at the screen dumps), it looks great.

I just got an iPad yesterday (my very first closed-source piece of equipment since 1998....)
I'll certainly look for this and buy it.. If anything, to support the cause. :)

I'm certain the new release of the iPad is just around the corner, and it might have an integrated camera, which could make it rather useful if you could somehow pull the data from its meter....

Do you have further ideas for version 2.0?
11/19/2010 12:10:39 AM · #33
Originally posted by mitalapo:

Just out of curiosity: how would you compare the accuracy and workflow of (reflected light) spot metering in aperture priority mode vs. incident light meters in manual mode?


They are two different approaches for two different purposes, really.

When you use the camera's metering, in a spot mode, you need to evaluate what is actually being metered and adjust the recommended exposure + or - to compensate for the camera's Zone 5 placement of whatever it is metering. Accordingly, it's best to use the camera in manual mode even when using the camera's meter, if accuracy of exposure is paramount. You CAN use EV compensation, but I don't see the point of it in those circumstances; it gets a little confusing. So, working in manual, you meter the snow in your winter scene, see that the camera "recommends" f/16 at 1/250 for an exposure, mentally say to yourself "I want Zone 6.5 snow, not Zone 5!" and adjust the exposure to f/13.5 at 1/125, adding a stop and a half. Or, in Av, EV compensation of +1.5 will do it.
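A stop-and-a-half adjustment like that can be sketched in a few lines of Python using the standard exposure-value formula, EV = log2(N²/t). This is only an illustration of the arithmetic (the function name is made up, and f/13.5 is the exact half-stop below f/16 — dials usually mark it f/13):

```python
import math

def ev(f_number, shutter_s):
    """Exposure value: EV = log2(N^2 / t). A lower EV means more light recorded."""
    return math.log2(f_number ** 2 / shutter_s)

# Meter reading for the snow: f/16 at 1/250
metered = ev(16, 1/250)

# Half a stop of aperture (f/16 -> f/13.5) plus a full stop of
# shutter (1/250 -> 1/125) adds roughly a stop and a half:
adjusted = ev(13.5, 1/125)

print(round(metered - adjusted, 2))  # ~1.49 stops more exposure
```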

When you use an incident meter, everything that is lit by the same light that is falling on the meter's hemisphere will be correctly exposed at the recommended exposure. You just dial it in and forget about it until the light changes. That's the theory, anyway. And it works very well in the normal run of things. But where it DOESN'T work really well is where there are extremes of bright or dark areas that you need to render full detail in. Your incident-light metering of your snowy scene will very likely blow out all detail in the snow, which (if measured by reflected light) might well fall on Zone 8 or even Zone 9, well outside the range where detail can be rendered.

So it's really 6 of one and half a dozen of the other.

When you're shooting in rapidly changing light conditions/scenes, camera metering tends to be the way to go. When you're working in steady light but the backgrounds are constantly changing (like sporting events, wedding receptions outdoors, whatever) then incident-light metering, "set it and forget it", is a very reliable way to work. You don't have to worry if the backdrop for your people is dark or light, or that your meter may be compensating for it.

Really, as you gain experience as a photographer, you will gain a more-or-less intuitive ability to *sense* the correct exposure, and you will start working in manual, with pre-set exposures, maybe 75% of the time, if not more. Back when I was working as an architectural photographer, even when shooting interiors we rarely used our light meters, except to use an extreme spot meter to take precise readings of potential trouble areas. We just KNEW what the exposures would be.

It's that sort of shooting that Mousie's app is bridging to, as he's pointed out earlier. It's very liberating to stop fiddling with dials and checking histograms, to just BE the camera and work as one with it.

R.
11/19/2010 12:50:51 AM · #34
Originally posted by Mousie:

Originally posted by TrollMan:

I just bought the payware version. I'll let you know how it works out for me. I only shoot manually BTW.


I'd love to hear your feedback. Unfortunately the new release does not ship until Apple drops iOS 4.2, but that should happen sometime this month, and you'll get the update for free of course. :)

I'm an Apple developer myself, so I've had 4.2 for quite a while. But I believe the official release of iTunes to support it came out yesterday. At least here in Norway. So not long now :)
11/19/2010 12:00:37 PM · #35
Originally posted by mitalapo:

Just out of curiosity: how would you compare the accuracy and workflow of (reflected light) spot metering in aperture priority mode vs. incident light meters in manual mode?


Here's the big deal for me: consistency.

When shooting manual under consistent lighting conditions (indoors, open sky, artificial light, etc.) you know your exposure will not change. If you underexpose one image by 1/3 stop, you've underexposed all your images by 1/3 stop. That's an easy, one-step fix in Lightroom. It doesn't matter if you point the camera's meter at a brighter part of the scene, or right into some shadows (two things that will cause any priority mode to produce different results)... you will get the same exposure... and that makes batch processing your images a lot faster.

Even when the light is variable (say, a windy, cloudy day), as long as I know how it's varying, I just set my camera to cope... one custom setting for open sky, one for mixed, and one for clouds, depending on the time of day. This works pretty well once you have practice. Another common thing I do is use one custom setting for sun and one for shade, for when I'm in the woods or shooting flowers in a park... the light's not changing, but the scenes are.

I also do a lot of off-camera flash, and that means balancing an exposure to the flash, not the ambient light (which I usually drive down a stop or two). Flash isn't something your camera's meter can read before the shot... priority modes are a no-go. I usually underexpose for ambient, then back-fill by adjusting my flashes' output or their distance from the subject, then take a reading of the flash with an incident light meter if I want to get super precise. I don't even have a non-manual workflow for this. :)

In general, I find that manual metering produces images that look a tad darker than what the camera would produce itself, preserving a lot more color saturation and detail in the highlights. This works a lot better for me, because I like to selectively tweak my exposures in Lightroom; it's a much better place to start from. It lets me brighten areas I want to draw attention to without sacrificing the quality of the image, and I don't have to play with the color as much, or at all!

I was actually kind of shocked at the difference this extra accuracy makes; it has pretty much soured me on reflected light metering, chimping, and using histograms. Histograms can mislead you badly, even though they'll usually get you somewhat close. Bust out the incident meter (or make a good guess/remind myself with a calculator) and BOOM, my images suddenly look rich. Lately, I've been able to do this even without a meter or calculator, after much training and practice. I have my go-to settings.
11/19/2010 12:03:44 PM · #36
Originally posted by Bear_Music:

It's that sort of shooting that Mousie's app is bridging to, as he's pointed out earlier. It's very liberating to stop fiddling with dials and checking histograms, to just BE the camera and work as one with it.

R.


This this this.

I shoot manual so I can be the camera, not be the guy second-guessing the camera! :)
11/19/2010 12:12:38 PM · #37
Originally posted by pitrpan:

Wow.. That's a lot of text to read through.
Unfortunately I'm at work and don't really have much time to crunch through it all.
At first appearance (which means by looking at the screen dumps), it looks great.

I just got an iPad yesterday (my very first closed-source piece of equipment since 1998....)
I'll certainly look for this and buy it.. If anything, to support the cause. :)

I'm certain the new release of the iPad is just around the corner, and it might have an integrated camera, which could make it rather useful if you could somehow pull the data from its meter....

Do you have further ideas for version 2.0?


I'm not so sure about a high quality back-facing camera on the iPad, but we can hope! The form factor is pretty large for a point and shoot, and it could eat into their iPhone market... they may limit the camera to the smaller devices just to encourage people to own both an iPad and an iPod/iPhone. One can always hope though!

I'm more excited about a retina display iPad, and a 7" iPad. Those would be COOOOL.

I've already listed my ideas for new features above a couple times, so I'll let you go find them. I also don't think using the internal iPhone/possible iPad camera meter is feasible for accurate metering; I've discussed that in detail above as well. :)

Thanks for the support, and be sure to let me know how you like the iPad version when it comes out! I put a lot of (useful!) eye candy in that one, I'm quite proud of it. :)
11/19/2010 12:18:12 PM · #38
Originally posted by TrollMan:

Originally posted by Mousie:

Originally posted by TrollMan:

I just bought the payware version. I'll let you know how it works out for me. I only shoot manually BTW.


I'd love to hear your feedback. Unfortunately the new release does not ship until Apple drops iOS 4.2, but that should happen sometime this month, and you'll get the update for free of course. :)

I'm an Apple developer myself, so I've had 4.2 for quite a while. But I believe the official release of iTunes to support it came out yesterday. At least here in Norway. So not long now :)


Oh sweet! Go team! Do you mind linking us to your Apps? I'd be curious to see the work of a fellow developer.

I don't know if this is the right place for it, but I also wouldn't mind discussing your iOS development experiences. Personally, my stuff is currently at the level of a hobby that pays for itself (woot Mac Pro!) but I don't do it as part of my actual programming career. I haven't even talked to anyone who does app development for a company, actually! So, how'd you get into it, what do you do, and who do you do it for? :)
11/19/2010 12:58:57 PM · #39
I'm not a commercial Apple developer either. This is only a hobby for me as well, although I did programming professionally many years ago. But I am registered as an Apple developer and do get the beta releases. I have developed 4 apps, but only for personal use, not in the App Store. They are amateur radio related and not usable by many, since they're for Norwegian repeater stations and satellite navigation.

Right now I'm working on an app with in-app SMS so that I can remotely control the timed cabin heater in my car. The SDK is lacking a few things in this department at the moment, but I've heard it will soon get better.
11/19/2010 01:21:50 PM · #40
Originally posted by TrollMan:

I'm not a commercial Apple developer either. This is only a hobby for me as well, although I did programming professionally many years ago. But I am registered as an Apple developer and do get the beta releases. I have developed 4 apps, but only for personal use, not in the App Store. They are amateur radio related and not usable by many, since they're for Norwegian repeater stations and satellite navigation.

Right now I'm working on an app with in-app SMS so that I can remotely control the timed cabin heater in my car. The SDK is lacking a few things in this department at the moment, but I've heard it will soon get better.


Might I suggest publishing them anyway? Even if only 10 people could use them, why not? You've done the work and paid for the license... as long as it's not buggy, put them out there and see what happens! You might even receive suggestions for easy ways to improve upon them to make them useful for more people. :)

Heck, I published a cat petting simulator, iPurr. You pet it, it purrs. The more you pet, the more excited it sounds. This actually makes money! A pittance admittedly, but better than nothing, particularly for an app that was basically meant to teach me how to program audio and touch interactions. You might be surprised. That silly app alone pays for my developer fees, yearly.

Who knows how many Norwegian amateur radio fans have iPhones? I certainly don't... and I'd be really tempted to find out by publishing! Honestly that's the real draw for me... it's kind of like a low-stakes marketing/market analysis game I can play in my spare time. I doubt I could make enough to live on this without a lot more commitment... but it sure is fun seeing how many people are interested in my stuff. I check my numbers all the time... it's like tracking DPC challenge scores!

Really, what do you have to lose?
11/19/2010 09:35:28 PM · #41
Originally posted by Mousie:


Ahh... I get it! You feel Java has some issues, and that when comparing it to Obj-C it's inferior somehow, or that it reflects poorly on Android development.

Actually, I just found it a bit strange to state that iOS uses Obj-C and C, and then go on to state that Android uses Java, as to me it implied that with iOS you have a choice whereas with Android you're stuck with what you get ;)

Originally posted by Mousie:


I don't really see it that way. The days of Java being slow are long, long gone. I write enterprise storage administration software in Java (and Java converted to JavaScript via GWT (the horror!))... if anything I'd suggest that Java has significant advantages over Obj-C for the overwhelming majority of app development, which is mainly consumer-targeted GUIs in these mobile contexts.

I haven't set foot in Obj-C, actually, being put off by the syntax you mention further down in your post. But I'd happily argue that Java is infinitely better suited for GUI code than plain C any day. That doesn't mean it can always stack up performance-wise though... ;)

Java converted to JavaScript sounds just horrible, by the way! I sort of like Java/ECMAScript, and have been using Nokia/Trolltech's QtScriptEngine a lot, although mostly from the C++ side of things (creating wrappers for existing C++ classes etc.). Still, autogenerated code is never fun to work with, especially when it is for an interpreter with no compile-time checks!

(snip)

Originally posted by Mousie:


Java may have a bad reputation, but I feel it's undeserved at this point. I find that most of the time it feels just like C (with an object syntax) but with less manual maintenance and better libraries/toolkits. I think Java is the most important feature of Android, a really smart choice on their part, and is going to lead to its eventual adoption in many more places than you currently see iOS. Obj-C is a hurdle to adoption in a way that Java is not.


Interesting to hear your opinion. I completely agree that Java's bad reputation is undeserved, although that seems to be improving now. To me, Java feels like C++ with garbage collection, and as you say with better libraries.

I happen to work mostly with C++ and Qt (plus an in-house 3D scenegraph library), and I have to say that Qt really goes a long way towards solving those problems in C++. The library supports practically everything GUI-related and also has some great utility classes like a proper string implementation, lots of collection classes, database connectivity, filesystem, networking, threading etc. Plus, most of the classes are built on QObject, which provides stuff like reflection, dynamic fields and parent-child-based memory management. Oh and did I mention it is open source and multi-platform? :D

Although Java definitely has its strengths, I'd say that Qt eats up a lot of the advantage Java used to have over C++... Now to find time to try out the Android port some day...

Anywho, back to smartphones: in the end both platforms (Android and iOS) are obviously great, and from a technical standpoint quite similar (touchscreen, ARM CPU, flash storage, GPS, 3G connectivity, accelerometer, etc.). I don't believe it will be the technical features (and especially not the choice of development language) that will dictate the outcome, but rather the marketing. At the moment, it seems Apple has won over the "easy to use" crowd whereas Google is standing strong with the "open standards (but we still want Flash)" crowd, and the way I see it, it could easily go either way.
11/20/2010 01:28:15 AM · #42
Originally posted by Mousie:

... freaks using a pinhole camera to shoot the moon through obsidian.


One of the funniest things I've ever read!!!! LOL!!

Congrats on the app, btw. I'll just have to wait until someone buys you an android :-)
11/20/2010 10:06:51 AM · #43
I already bought this.
11/21/2010 11:09:49 AM · #44
Originally posted by wiesener:

I happen to work mostly with C++ and Qt (plus an in-house 3D scenegraph library), and I have to say that Qt really goes a long way towards solving those problems in C++. The library supports practically everything GUI-related and also has some great utility classes like a proper string implementation, lots of collection classes, database connectivity, filesystem, networking, threading etc. Plus, most of the classes are built on QObject, which provides stuff like reflection, dynamic fields and parent-child-based memory management. Oh and did I mention it is open source and multi-platform? :D

Although Java definitely has its strengths, I'd say that Qt eats up a lot of the advantage Java used to have over C++... Now to find time to try out the Android port some day...

Anywho, back to smartphones: in the end both platforms (Android and iOS) are obviously great, and from a technical standpoint quite similar (touchscreen, ARM CPU, flash storage, GPS, 3G connectivity, accelerometer, etc.). I don't believe it will be the technical features (and especially not the choice of development language) that will dictate the outcome, but rather the marketing. At the moment, it seems Apple has won over the "easy to use" crowd whereas Google is standing strong with the "open standards (but we still want Flash)" crowd, and the way I see it, it could easily go either way.


C++ was my first true programming love... though I loved it for all the wrong reasons. I'm a big fan of operator overloading for calculating sprite geometry... adding points with + makes sense. Unioning and intersecting rectangles with || and && makes sense! I also love love LOVE the idea of references as non-null pointers... if you start early and are strict you can add a ton of robustness to an API by guaranteeing your return values and pre-scrubbing incoming parameters. So cool!!! I'll have to take a peek at Qt. The last time I did anything serious in C++ was in the 90's when STL ruled. :)

Of course, Java strips out all the coolest, most dangerous features of C++... it doesn't even have a pointer syntax! That's why I think of it as more like C.

To be clear, I don't think it's specific development features or of-the-moment technical advantages that will make one phone or the other do well... in my opinion it's the quality of the user experience that does that. The reason I've suggested that choosing Java will eventually lead to a wider adoption of Android is that despite Apple's current head start (actual lead time, plus iOS development toolkits that easily make an app feel very slick) I'm predicting that this advantage is going to shrink and even reverse over time... the simple fact is that Obj-C is unusual and arcane, and way, way, WAY more people can/want to program in Java. Given the larger developer base and the more open nature of development, it's only a matter of time before Android phones will have a correspondingly larger base of better-written software, unless something really terrible goes awry. Better software equals more users, and there are simply more people capable of improving both the Android framework and the applications written for it.

Choosing Java as the primary language is part of Android marketing. It says "come write for us!" in a way Obj-C never can. Obj-C says "Sell your soul to Mr. Jobs, because ain't nobody else usin' that skill!"

So it's not the development tools... it's the quantity of quality software coming out of those development tools that counts, and I have a hard time seeing Obj-C competing in this regard in the long term, as much as I adore it. I mean, do YOU want to learn Obj-C, when you have experience in Java and could capture similar market share in an ecosystem you already understand how to write code for? All you have to do today is crack a book on Android APIs and you're golden. Do you really want to absorb a radically different syntax, with a whole new suite of unknowns? That's a big hurdle to leap!

And that's the genius of Google's decision, IMO. They've significantly lowered the barrier to entry, and that will have long term consequences. I doubt either platform is going away any time soon, they're both pretty great like you've suggested... but unless Apple makes it easier for "non-niche" engineers to develop for their systems (and their resistance towards allowing anything non-native on their platforms indicates this isn't a huge concern for them) they risk reproducing another OS X vs. Windows situation. Arguably better, but with less adoption. :)

Thanks for sharing your own opinions! It's always fun to talk to another programmer, particularly one who seems to know their stuff and doesn't mind pushing back a bit. We're so darn opinionated! You've definitely given me some stuff to think about! :D
11/21/2010 11:34:06 AM · #45
Oh hey! May as well mention it here:

FREE FREE FREE FREE FREE FREE FREE!!!

PixelSwarm, my trippy particle system toy, is FREE for the week of Thanksgiving, to give thanks for this hobby paying for itself! Thanks y'all! :)

Yeah, it's off topic... but it's freakin' FREE!!!

Grab it while you can!

Message edited by author 2010-11-21 11:37:52.
11/21/2010 12:00:25 PM · #46
I don't use the video function of my camera; however, why specialize in just still imagery when video is widely used by many people with their gear?

Open up your market share and include Frames Per Second to calculate proper exposure.

Just a thought.
11/22/2010 05:27:21 PM · #47
Originally posted by Man_Called_Horse:

I don't use the video function of my camera; however, why specialize in just still imagery when video is widely used by many people with their gear?

Open up your market share and include Frames Per Second to calculate proper exposure.

Just a thought.


This sounds interesting! Do you have a pointer where I could look at how to calculate to compensate for frames per second? Is there some standard math people use? I haven't seen this before, but heck, if I could make it work for video too that would be great!
11/22/2010 05:32:45 PM · #48
iOS 4.2 has shipped and Expositor 2.0 just went live! It's in the store right now, I checked!

For any of you that already own a copy and have an iPad, I'm super excited to hear your feedback!
11/26/2010 08:29:26 PM · #49
Originally posted by Mousie:

Originally posted by Man_Called_Horse:

I don't use the video function of my camera; however, why specialize in just still imagery when video is widely used by many people with their gear?

Open up your market share and include Frames Per Second to calculate proper exposure.

Just a thought.


This sounds interesting! Do you have a pointer where I could look at how to calculate to compensate for frames per second? Is there some standard math people use? I haven't seen this before, but heck, if I could make it work for video too that would be great!


Just like still photography, moving images have their own dynamic trade-offs. Everything is in "it depends" territory.

At 24 fps, ISO 100, with a 50 mm lens and no filter, the aperture can run from f/4.8 to as much as f/5.6. If ANY of the parameters change, your math changes. Raising the fps increases your lighting needs, and vice versa when slowing the fps.

The math changes with the type of lens, the sensitivity of the sensor, whether you're using filters, blah, blah.

Sounds like you have a handle on the still side; take the time to do some experiments... it's easy enough.
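For what it's worth, the usual starting convention for video exposure is the 180-degree shutter rule: shutter time is half the frame interval, so raising the frame rate costs you light. The poster doesn't name a formula, so treat this Python sketch as one standard convention rather than his method (function names are made up):

```python
import math

def shutter_for(fps, shutter_angle=180):
    """Shutter time implied by a shutter angle (180 deg = half the frame interval)."""
    return (shutter_angle / 360) / fps

def stops_lost(fps_old, fps_new):
    """Stops of light given up by raising the frame rate at a fixed shutter angle."""
    return math.log2(fps_new / fps_old)

print(shutter_for(24))     # 1/48 s at 24 fps
print(stops_lost(24, 48))  # doubling the fps costs one full stop
```

At a fixed frame rate the remaining variables (aperture, ISO, ND filters) have to absorb whatever the shutter can no longer give, which is why the trade-offs above are all "it depends".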
11/26/2010 11:47:26 PM · #50
I am finding it funny every time there is a "Make it for Android" comment (here and on other sites). I have been using Apple/Macs since 1982, and it has always been, "Sweet software!!! OH! Not for Mac, and never will be, because they don't want to do the work to port it over." Turnabout is fair play. Not to mention, revenge is sweet! LOL


DPChallenge, and website content and design, Copyright © 2001-2020 Challenging Technologies, LLC.
All digital photo copyrights belong to the photographers and may not be used without permission.