4

EDIT, 2020/09: If anyone is wondering, 12 years later, yes, we've all moved to JSON and Kubernetes by now. Original text follows.

Obviously, there is no single solution that would satisfy everyone's needs; an architecture is always a trade-off. I want to create a framework aimed primarily at rapid application development (RAD) of web games. The target language is PHP, although the architecture should be widely applicable.

The goals I have in mind for this framework are: flexibility in the ways you can achieve a result; maximum comfort for developers; connecting modules like LEGO® blocks; many types of input, many types of output, one format for processing.

The goals that are not a priority are speed, enterprise use and making money. It's supposed to be an open source project.

The cornerstone of this design is that all content, before transformation, is processed as XML (an idea based on an EAI system I've worked with, eGate). The data abstraction layer - hopefully some smart ORM - is not important now. The output will be generated using XSLT or other custom modules, for virtually any client - HTML for old browsers, XHTML/HTML5 for modern browsers, simple HTML for mobile clients, XML for AJAX/XML-RPC, etc.
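
To make this concrete, here is a minimal PHP sketch of the XML-in / XSLT-out idea, assuming the php-xsl extension is available; the <page>/<article> elements and the tiny stylesheet are made-up illustrations, not a fixed schema of the framework.

    <?php
    // Minimal sketch: content is built as a DOM tree, then a client-specific
    // XSLT stylesheet turns it into output markup. Element names are examples.
    $content = new DOMDocument();
    $content->loadXML(
        '<page>
            <article importance="high">
                <title>Hello</title>
                <body>Generated by the processing phase.</body>
            </article>
         </page>'
    );

    // An illustrative stylesheet for one output type (e.g. modern browsers).
    $stylesheet = new DOMDocument();
    $stylesheet->loadXML(
        '<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
           <xsl:output method="html"/>
           <xsl:template match="/page">
             <html><body><xsl:apply-templates select="article"/></body></html>
           </xsl:template>
           <xsl:template match="article">
             <h1><xsl:value-of select="title"/></h1>
             <p><xsl:value-of select="body"/></p>
           </xsl:template>
         </xsl:stylesheet>'
    );

    // Post-processing: run the transformation.
    $processor = new XSLTProcessor();
    $processor->importStylesheet($stylesheet);
    echo $processor->transformToXML($content);

A different stylesheet per client type (plain HTML, mobile markup, XML for AJAX) would plug in at the last step without touching the business logic.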

Main reasons for using XML are:

  • it's a well-known standard
  • existing tools like XPath, SimpleXML and DOM for navigating and modifying the content (a short SimpleXML/XPath sketch follows this list)
  • XSLT providing a powerful and unified way to transform the code into any tag soup
  • I find XML markup very easy to read, so I don't think the advantages of JSON or YAML make a difference here
  • the content can be stacked easily, and the order of the content doesn't really matter as long as it's transformed correctly with XSLT
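
As a rough illustration of the second point, this is how a content tree could be inspected and extended with SimpleXML and XPath; the markup and attribute names here are invented for the example.

    <?php
    // Navigate and modify an example content tree with SimpleXML + XPath.
    $page = simplexml_load_string(
        '<page>
            <article tags="news" importance="high"><title>First</title></article>
            <article tags="misc" importance="low"><title>Second</title></article>
         </page>'
    );

    // XPath: select only the articles relevant for the current output.
    foreach ($page->xpath('//article[@importance="high"]') as $article) {
        echo (string) $article->title, "\n";   // prints "First"
    }

    // "Stacking" more content later is cheap, since order doesn't matter here.
    $extra = $page->addChild('article');
    $extra->addAttribute('importance', 'low');
    $extra->addChild('title', 'Appended later');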

The page generation process would consist of these phases (a rough pipeline sketch in PHP follows the list):

  1. Pre-processing: initializing modules, processing GPCS (GET/POST/COOKIE/SESSION) data, applying default [XML] templates
  2. Processing/generation: the main part of the business logic, generating the bloated XML with maximum data (although hopefully optimized not to generate ballast)
  3. Processing: some additional business logic, e.g. cutting down some of the markup, preparing for transformation, reporting, statistics, etc.
  4. Post-processing: running the XML through the transformation engine (most probably just XSLT) and producing the output.
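
Here is a rough skeleton of how those four phases could hang together in PHP; the Module interface and its method names are placeholders I made up for the sketch, not part of any existing code.

    <?php
    // Hypothetical module contract covering phases 1-3; phase 4 is shared.
    interface Module
    {
        public function preProcess(DOMDocument $xml);   // init, GPCS data, default XML templates
        public function process(DOMDocument $xml);      // main business logic, fills the tree
        public function postProcess(DOMDocument $xml);  // trimming, reporting, statistics
    }

    function renderPage(array $modules, DOMDocument $stylesheet)
    {
        $xml = new DOMDocument();
        $xml->loadXML('<page/>');

        foreach ($modules as $m) { $m->preProcess($xml); }   // phase 1
        foreach ($modules as $m) { $m->process($xml); }      // phase 2
        foreach ($modules as $m) { $m->postProcess($xml); }  // phase 3

        // Phase 4: hand the finished tree to the transformation engine.
        $xslt = new XSLTProcessor();
        $xslt->importStylesheet($stylesheet);
        return $xslt->transformToXML($xml);
    }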

The content would be generated with a lot of meta-data (e.g. tags, permissions, importance, necessity, intended output type), which would be stripped out during post-processing.
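
For example, stripping that meta-data could be done with DOMXPath during post-processing; the attribute names below (permissions, importance, output) just stand in for the kinds of meta-data listed above.

    <?php
    // Remove nodes the client may not see, then strip the remaining meta attributes.
    $xml = new DOMDocument();
    $xml->loadXML(
        '<page>
            <article importance="low" output="html" permissions="admin">
                <title>Internal note</title>
            </article>
            <article importance="high" output="html">
                <title>Public story</title>
            </article>
         </page>'
    );

    $xpath = new DOMXPath($xml);

    foreach (iterator_to_array($xpath->query('//*[@permissions="admin"]')) as $node) {
        $node->parentNode->removeChild($node);
    }

    foreach (iterator_to_array($xpath->query('//@importance | //@output')) as $attr) {
        $attr->ownerElement->removeAttribute($attr->nodeName);
    }

    echo $xml->saveXML();   // clean markup, ready for the transformation pass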

So, my question is: speed aside, what are the pitfalls of this solution? Where could it go wrong, both during the development and maintenance of the framework and of its applications? What are the downsides of this architecture?

analytik

5 Answers

3

XSLT can be bulky to manage, and essentially adds an extra programming language that developers would have to work in (at least if I understand your description correctly). My experience has been that relatively few people know it, and even fewer can make it do what they want.

acrosman
  • My experience with XSLT has shown none of these drawbacks. – dacracot Oct 09 '08 at 21:17
  • You raise a valid point about the number of people mastering XSLT; there is some dichotomy in coupling PHP with XSLT, but it seems to be the only formal language for transforming XML. I'll probably want to offer an alternative to XSLT. Thanks. – analytik Oct 10 '08 at 01:19
2

I'm not sure what "flexible framework" to suggest to you either. It all kinda depends on what you're comfortable with and your personal taste.

One thing I do know is that, however appealing it may look at first, you should stay away from XSLT. Doing Hello World type stuff and simple examples with XSLT is pretty straightforward. However, more complex projects become completely unmanageable (not to mention unreadable) with XSLT. My experience is that it puts a massive strain on the project.

Luke
1

I think you are looking at a highly complicated solution. It is a major effort simply to design and build out the schemas that you will be using. If you're on a project that involves more than 5-6 people in total, you will likely need an organized schema design effort. I think this is a point you are aware of.

I question, and possibly take issue with, the selection of PHP on the front end. I also think that deciding on XML across the board is largely a mistake.

Here is what I do:

  1. Build a service layer using Grails.org.
  2. Keep every resource that can be RESTful as REST.
  3. Use the X-fire plugin in Grails to build out any SOAP services that need to be built.
  4. Take advantage of GORM and the RAD functionality of Grails to reduce development time.
  5. Construct clients in X or Y language or platform to consume these services.

I would definitely want the speed of pure Java doing all of my XML translation/processing. If you have large documents, processing them will take considerable time.

You understand the forces of your environment better than anyone, but I would caution you to do the simplest thing that works first and not over-architect.

Daniel Honig
  • Thanks for the opinion. However, even if my idea is overly complex for a simple CMS, it won't involve lengthy document processing - I'm quite sure that processing my XML with XSLT in PHP will be much faster and a lot less complicated than calling a web service built in YET another language. – analytik Oct 10 '08 at 01:23
1

What you describe could likely be implemented using tox. It uses a hybrid MVC-ARS architecture. The obstacle that I see is the cost that tox presents because of its dependency on Oracle. Of course, since it is open source, you could convert it to PostgreSQL.

dacracot
-2

I've worked on a few web games in the past and, to be honest, none of them have ever needed anything this complex and unwieldy.

Robert Rouse