Discussion:
Needs to know everything; was: RE: [AM] ZZZ
Paul Oldfield
2004-02-04 22:17:36 UTC
(responding to Dagna)
EVERYONE NEEDS TO KNOW EVERYTHING!
Anyway, the basic problem from my point of view is that
I'm not sure I understand. I may be good at 'agility' but I
am not a DBA.
DG: Unless you need to know how the DBMS works, you don't
need to spend too much time discovering the fine details
(although a general understanding is essential, IMHO).
Actually I helped develop 2 DBMS - doesn't mean I know how
to use the things, though ;-)
That's what the DBAs are paid for - and part of their job
should be helping people who don't need to know the details
work out the stuff they need to do.
Are you *sure* you work with data? Or were you just about
to add the "but not until 6 weeks after they ask for help"? ;-)
I firmly believe that what we all need is the ability to understand
the other person's concerns and viewpoint.
Definitely. That's my take on the "Generalising Specialist"
idea - you should know enough about the work of anyone
you're communicating with to understand what they are
saying - ideally you should be able to *do* the basic
elements of their work.
Even if we don't agree with it (it makes arguing and persuading
much easier). Next time a DBA has a hissy fit and refuses point
blank to even consider doing x - ask them why, and explain
that if you understand the reasons it will make both your
lives much easier in future. And when they have explained, you
can maybe rephrase your request or redo it in a way that won't
bring western civilisation down by the end of the day. (As the
aforementioned DBA explained would be the inevitable
consequence of your new column.)
I think what may fall through the cracks here are the cases where
the DBA gives explanations that are valid in the good old
fashioned traditional environment, but no longer hold true in
the agile world. I guess if the DBA doesn't know how to do it
in an agile fashion we are stymied anyway - but the ability to
say there's another way *would* be handy. The DBA might
just consider learning how to do it the agile way.
And, sometimes, someone is horribly overworked, incompetent,
in a bad mood, lazy, or just hates you.
What, after all the new schema and migration script stuff I
gave him to help him with the changes we need? How could
that be? We nearly did his work for him! :-)
My thoughts are that if everyone went through stored
procedures, then you'd only have them as clients of the
database. I guess the picture isn't so simple for some
reason, but why not?
DG: If every new development uses stored procedures, then
life will get easier.
So, are they? I saw embedded SQL being written only last
year, and I expect to see (and moan about) more for a few
years yet.
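For what it's worth, the "stored procedures as the only client" idea can be sketched outside SQL too. Below is a minimal Python sketch using sqlite3, which has no stored procedures, so a gateway class stands in for them; the customer table and method names are invented for illustration:

```python
import sqlite3

# A gateway class playing the role of a set of stored procedures:
# callers never issue SQL against the tables themselves, so the
# database has only one kind of client. (Hypothetical schema.)
class CustomerGateway:
    def __init__(self, conn):
        self._conn = conn
        self._conn.execute(
            "CREATE TABLE IF NOT EXISTS customer "
            "(id INTEGER PRIMARY KEY, name TEXT)"
        )

    # Each method is the moral equivalent of one stored procedure.
    def add_customer(self, name):
        cur = self._conn.execute(
            "INSERT INTO customer (name) VALUES (?)", (name,)
        )
        return cur.lastrowid

    def customer_name(self, customer_id):
        row = self._conn.execute(
            "SELECT name FROM customer WHERE id = ?", (customer_id,)
        ).fetchone()
        return row[0] if row else None

gateway = CustomerGateway(sqlite3.connect(":memory:"))
cid = gateway.add_customer("Dagna")
```

With everything funnelled through the gateway, a schema change only has to be absorbed in one place - which is exactly the attraction of the stored-procedure-only discipline.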
And, eventually, all the old apps will get replaced (the tape
drives will get too expensive to maintain, the people who write
assembler will come up to retirement and no-one will want to
take a job using it, the companies will get taken over and
either the take-overer or the take-overee will have a spiffy new
system and there will be a corporate will to move from a
dozen billing systems to only two or three, and then the old data
will get migrated or dumped. And there will be world peace. <g>)
Hmm... let me paraphrase... The old apps will get replaced by
carbon copies but in procedural Java not Cobol or assembler;
Somebody will build a next-generation tape drive because
there are too many systems needing the hardware; The assembler
programmers will retire before the programs are replaced, and
nobody knows what they do, so we need to build an assembler
emulator so they keep on working when the mainframe goes;
the companies will choose the worst of the systems because the
modern systems are easy to replace but the bad ones are too
hard to replace; the old data will get older and dirtier, but that's
all we've got... cynical, me?
And one day, an old hack will be someone who remembers
Java. (Currently, an 'old hack' is likely to be someone who
remembers paper tape, punch cards, and that you could play
Smoke on the Water on mainframe core memory.)
I never played Smoke on the Water on mainframe core memory,
you must be thinking of somebody else. Pete? The beads
hurt my teeth. Was I doing it wrong?


Paul Oldfield

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
www.aptprocess.com

any opinions expressed herein are not necessarily those of
Mentors of Cally or the Appropriate Process Movement
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

For more information about AM, visit the Agile Modeling Home Page at www.agilemodeling.com
--^----------------------------------------------------------------
This email was sent to: gcma-***@gmane.org

EASY UNSUBSCRIBE click here: http://topica.com/u/?bUrKDA.bWnbtk.Z2NtYS1h
Or send an email to: agilemodeling-***@topica.com

TOPICA - Start your own email discussion group. FREE!
http://www.topica.com/partner/tag02/create/index2.html
--^----------------------------------------------------------------
Gaythorpe, Dagna
2004-02-09 15:07:15 UTC
-----Original Message-----
Sent: 04 February 2004 22:18
Subject: Needs to know everything; was: RE: [AM] ZZZ
(responding to Dagna)
EVERYONE NEEDS TO KNOW EVERYTHING!
That's what the DBAs are paid for - and part of their job should be
helping people who don't need to know the details work out the
stuff they need to do.
Are you *sure* you work with data? Or were you just about
to add the "but not until 6 weeks after they ask for help"? ;-)
DG: Yes, I work with data - but mostly at the metadata level. Which, IM(not
so)HO, covers everything from the Enterprise Conceptual Model to tuning the
databases. (At least, knowing what to look out for and what the dbas can
reasonably be asked to do, and not designing the thing to make it too hard
for them).

<snip>
I think what may fall through the cracks here are the cases where
the DBA gives explanations that are valid in the good old
fashioned traditional environment, but no longer hold true in
the agile world. I guess if the DBA doesn't know how to do
it in an agile fashion we are stymied anyway - but the
ability to say there's another way *would* be handy. The DBA
might just consider learning how to do it the agile way.
I think that this is one of the problem areas - the dba may not see it as
'traditional' or 'agile', but as 'risk'. I am not entirely convinced that
the Agile method can be extended beyond software development too easily -
one of the things I am on this list to learn. And if it can't, then the
architects have to learn how to work with the agile developers and the dbas
so as to minimise the pain felt all round.

And we are back to the need to come up with a generic, flexible and very
granular design very quickly, at the start of the project, which is why that
great, monolithic, 120-entity Logical Enterprise Data Model, with all its
definitions and formats and cross-references and supporting stuff, is so
valuable. Once that is done, any part of the initial design that has
been done before is just picked out of that, and the new bits added - my
average time at this point is a one-hour meeting at the start of the project
where I get told what the project does, the relevant bits of the enterprise
logical model are drawn up on the white board, and then any new stuff is
added in and agreed. Then I go and extract the basic project model, add the
new stuff, and hand it over. Then go through it with the dba, if he wasn't
at the original meeting. Total time can be less than half a day for a new
operational/processing application. (And the tools I use can generate the
scripts to create the db). Then we just have to find somewhere to put it. Go
buy a new server, maybe, work out the space and stuff, do we need new
licences... Maybe some of the problems with dbas are caused by the drag of
the hardware?

A thought about the problem - maybe data is not agile because it is the
trail left (or the framework needed) by the execution of business processes
and policy - and the data can only be as agile as the things it records.
Which is often pretty staid.

Dagna

Dagna Gaythorpe
Data Architect
International IT
COLT TELECOM GROUP PLC
Beaufort House, 15 St Botolph Street,
London EC3A 7QN
t (+ 44) 020 7 390 7896
f (+ 44) 020 7 947 1176
e ***@colt-telecom.com




*************************************************************************************
COLT Telecommunications
Registered in England No. 2452736
Registered Office: Beaufort House, 15 St. Botolph Street, London, EC3A 7QN
Tel. +44 20 7390 3900


This message is subject to and does not create or vary any contractual
relationship between COLT Telecommunications, its subsidiaries or
affiliates ("COLT") and you. Internet communications are not secure
and therefore COLT does not accept legal responsibility for the
contents of this message. Any view or opinions expressed are those of
the author. The message is intended for the addressee only and its
contents and any attached files are strictly confidential. If you have
received it in error, please telephone the number above. Thank you.
*************************************************************************************

p***@aol.com
2004-02-09 16:18:04 UTC
In a message dated 2/9/2004 7:13:27 AM Pacific Standard Time,
Post by Gaythorpe, Dagna
A thought about the problem - maybe data is not agile because it is the
trail left (or the framework needed) by the execution of business processes and
policy - and the data can only be as agile as the things it records. Which is
often pretty staid.
Dagna:

Excellent points, particularly that of defining the DBA's perspective of risk.
This is something we should be cognizant of when dealing with the data guys.
They see software in an entirely different light than developers. Further, their
anxieties typically grow exponentially with the length of time the system has
been in use. We have often described the problems of having to maintain
software developed long before we get involved with it: the lack of documentation
and knowledge, etc. The same is true of the data guys: DBAs come and go, but the
data (an important business asset) remains. While the risks associated with
modifications to the software can be substantial, programmers have some degree
of controlling that risk. Modifications to the structure of the data (schema
changes, loss or corruption of data, not getting everything changed and back in
sync, adverse effects on applications, etc.) may be perceived as posing a
greater risk, at least to the DBAs, and to management as well.

Regards,

Pete

Paul Oldfield
2004-02-09 17:42:56 UTC
(responding to Dagna)

EVERYONE NEEDS TO KNOW EVERYTHING!
(Paul)
I think what may fall through the cracks here are the cases where
the DBA gives explanations that are valid in the good old
fashioned traditional environment, but no longer hold true in
the agile world. I guess if the DBA doesn't know how to do
it in an agile fashion we are stymied anyway - but the
ability to say there's another way *would* be handy. The DBA
might just consider learning how to do it the agile way.
(Dagna)
I think that this is one of the problem areas - the dba may
not see it as 'traditional' or 'agile', but as 'risk'.
That's fair enough, as long as all risks are evaluated fairly,
including the risk of being unable to respond to change
in a timely manner when it needs to happen.
I am not entirely convinced that the Agile method can be
extended beyond software development too easily -
one of the things I am on this list to learn. And if it can't,
then the architects have to learn how to work with the agile
developers and the dbas so as to minimise the pain felt
all round.
Agreed. Programming used to have the same problem
of change being extremely painful, and one of the first
solutions was to encapsulate the data and put it and
behaviour behind interfaces, so that the ripple effect
was stopped. That's now standard in programming,
but not yet in database. I think that must be the first step
toward agility. Yet it won't be as easy as it was for
programming, because 'we' didn't have data migration
problems; 'you' have always dealt with that for us. OTOH
'you' have 'our' experience to learn from.
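To make the encapsulation point concrete, here is a minimal sketch (hypothetical schema and class names, sqlite3 standing in for a real DBMS): two stores expose the same interface while storing the name differently, so the calling code is untouched by the change of representation - which is the ripple-stopping effect described above.

```python
import sqlite3

class PersonStoreV1:
    """Stores the name as a single column."""
    def __init__(self):
        self._conn = sqlite3.connect(":memory:")
        self._conn.execute(
            "CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)"
        )

    def save(self, name):
        return self._conn.execute(
            "INSERT INTO person (name) VALUES (?)", (name,)
        ).lastrowid

    def name_of(self, pid):
        return self._conn.execute(
            "SELECT name FROM person WHERE id = ?", (pid,)
        ).fetchone()[0]

class PersonStoreV2:
    """Same interface, but the underlying table has been split into
    given/family columns; callers are unaffected by the change."""
    def __init__(self):
        self._conn = sqlite3.connect(":memory:")
        self._conn.execute(
            "CREATE TABLE person "
            "(id INTEGER PRIMARY KEY, given TEXT, family TEXT)"
        )

    def save(self, name):
        given, _, family = name.partition(" ")
        return self._conn.execute(
            "INSERT INTO person (given, family) VALUES (?, ?)",
            (given, family),
        ).lastrowid

    def name_of(self, pid):
        given, family = self._conn.execute(
            "SELECT given, family FROM person WHERE id = ?", (pid,)
        ).fetchone()
        return f"{given} {family}".strip()

# The calling code is identical for both representations.
def greet(store, pid):
    return f"Hello, {store.name_of(pid)}"
```

The interface, not the table layout, is the contract - swap V1 for V2 and `greet` never knows.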
And we are back to the need to come up with a generic,
flexible and very granular design very quickly, at the start of
the project, which is why that great, monolithic, 120-entity
Logical Enterprise Data Model, with all its definitions and
formats and cross-references and supporting stuff, is so
valuable.
Well, I'd start with a Domain Model... I think there are a few
basic mismatches between an Object model and a 3NF
relational data model. The ones that spring to my
relatively untutored mind are the inheritance problem
and the reference / foreign key problem. It seems to me
that some sort of compromise should be possible, but
the solution may be biased different ways in different
cases. But what should determine how the solution is
biased?
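One of the usual compromises for the inheritance mismatch is "single-table inheritance": all subclasses share one table with a type discriminator column. A minimal sketch, with a hypothetical Account hierarchy and sqlite3 standing in for the database:

```python
import sqlite3

# Hypothetical class hierarchy to be persisted.
class Account:
    def __init__(self, owner):
        self.owner = owner

class SavingsAccount(Account):
    kind = "savings"

class CurrentAccount(Account):
    kind = "current"

KINDS = {"savings": SavingsAccount, "current": CurrentAccount}

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE account (id INTEGER PRIMARY KEY, kind TEXT, owner TEXT)"
)

def save(account):
    # One table for the whole hierarchy; 'kind' records the subclass.
    return conn.execute(
        "INSERT INTO account (kind, owner) VALUES (?, ?)",
        (account.kind, account.owner),
    ).lastrowid

def load(account_id):
    kind, owner = conn.execute(
        "SELECT kind, owner FROM account WHERE id = ?", (account_id,)
    ).fetchone()
    return KINDS[kind](owner)  # the discriminator picks the subclass back out

aid = save(SavingsAccount("Dagna"))
restored = load(aid)
```

The bias question remains: one wide table (simple joins, nullable columns) versus a table per class (clean 3NF, more joins) - the sketch above just shows the simplest end of that trade-off.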

There's also a question as to whether there's any reason
to have all those 120 entities in the same database or
whether the domain concepts would be better partitioned
into subject areas - I've seen both approaches but don't
know enough to say which approach is best in what
circumstance.
Once that is done, any part of the initial design that has
been done before is just picked out of that, and the new bits
added - my average time at this point is a one-hour meeting
at the start of the project where I get told what the project
does, the relevant bits of the enterprise logical model are
drawn up on the white board, and then any new stuff is
added in and agreed. Then I go and extract the basic project
model, add the new stuff, and hand it over. Then go through
it with the dba, if he wasn't at the original meeting. Total time
can be less than half a day for a new operational/processing
application. (And the tools I use can generate the scripts to
create the db). Then we just have to find somewhere to put it.
Go buy a new server, maybe, work out the space and stuff,
do we need new licences... Maybe some of the problems
with dbas are caused by the drag of the hardware?
I think this is about the right time to mention the differences
between Data-rich, Behaviour-rich and Control-rich
systems. Programmers will already be throwing up their
hands in horror at your suggested approach, and maybe
I can explain why, before anyone starts casting aspersions ;-)

In a data-rich system, the structure of the data drives the
problem, and though programmers would start with an object
model, starting with a data model would give perfectly
adequate results. For behaviour-rich systems, the behaviour
of the objects is significant, and if we are to have any
reasonable chance at building the system economically
and in a way that can respond readily to change, then
correct apportionment of behaviour to objects is paramount.
The data needed is determined by the behaviour needed.
In control-rich systems, behaviour is also important, but
we have the added complication that the behaviour of
objects changes significantly with time.
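A tiny sketch of what "behaviour changes significantly with time" means in practice (a hypothetical Order with made-up states): the same call succeeds or fails depending on the state the object has reached, which is exactly what a static data model struggles to capture.

```python
# Minimal state-dependent behaviour: the object's response to the
# same message changes as it moves through its lifecycle.
class Order:
    def __init__(self):
        self.state = "draft"

    def submit(self):
        if self.state != "draft":
            raise ValueError("only a draft order can be submitted")
        self.state = "submitted"

    def cancel(self):
        # Behaviour depends on how far the order has progressed.
        if self.state == "shipped":
            raise ValueError("too late to cancel")
        self.state = "cancelled"

order = Order()
order.submit()
order.cancel()
```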

In reality, encapsulation of the data will save the day,
we hope, because the persistence layer can map
between the business object representation and the
underlying data representation, should they need to
differ. Yet awkward mappings can be performance
hogs.

(Phew - did I smooth all the ruffled feathers?)
A thought about the problem - maybe data is not agile
because it is the trail left (or the framework needed) by the
execution of business processes and policy - and the data
can only be as agile as the things it records.
Which is often pretty staid.
That's probably not relevant - I can envisage cases
where the same data is used by a 30-year-old core
business dinosaur and an ultra-modern agile added
value miracle (carefully displaying extreme bias ;-) ).


Paul Oldfield


Gaythorpe, Dagna
2004-02-10 16:41:56 UTC
(Responding to Paul)
EVERYONE NEEDS TO KNOW EVERYTHING!
(Dagna)
I think that this is one of the problem areas - the dba may
not see it
as 'traditional' or 'agile', but as 'risk'.
(Paul)
That's fair enough, as long as all risks are evaluated
fairly, including the risk of being unable to respond to
change in a timely manner when it needs to happen.
DG: One way of reducing that risk would be to have a dba involved in the
project (in my youth, I made sure my grandmother was well up to date in the
latest egg-sucking techniques... <g>). While all the dbas I have ever
encountered have been obliging, flexible, co-operative team players (was
that a snigger at the back?), some tend to refuse everything or set up great
fences to be got over, to stop people messing up 'their' data. Or so I am
told. ;-) But if you can get them involved (and get over the two standard
responses - "you should have involved me sooner" and "why are you dragging
me in? I don't need to be involved yet!"), then 'your' data is also 'theirs'
- which helps a lot. If this sounds like I am anti-dba - apologies, I am
not, and I do appreciate how much they have to do. But I have also suffered
from their attitudes.

(Paul)
Agreed. Programming used to have the same problem
of change being extremely painful, and one of the first
solutions was to encapsulate the data and put it and
behaviour behind interfaces, so that the ripple effect was
stopped. That's now standard in programming, but not yet in
database. I think that must be the first step toward
agility. Yet it won't be as easy as it was for programming,
because 'we' didn't have data migration problems; 'you' have
always dealt with that for us. OTOH 'you' have 'our'
experience to learn from.
DG: This one took some thought. And will do for some time, I think.

First thoughts - if it has been encapsulated, then surely that works both
ways? Answer - no, it almost certainly doesn't. If a data change is needed,
then encapsulation can protect those apps, calls and so on that don't use
the new structure (assuming that the old one is still supported). If the
underlying data remains the same, and maybe a new index or whatever is
added, or a table split into master and detail, then the outside world needs
to know nothing. But if the format or contents of a field changes, or the
field is no longer used, or a new one appears, then there is an impact on
the outside world. Especially if it is something like the infamous 'name
field split' scenario, which may require a change to the way the apps
present the name to the database for insertion and update.
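A small sketch of exactly this scenario (hypothetical names, sqlite3 syntax): a compatibility view keeps old readers working after the split, but an old-style writer that only knows the single name field breaks - reads are shielded, the write contract leaks.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The table after the infamous 'name field split'.
conn.execute(
    "CREATE TABLE person (id INTEGER PRIMARY KEY, given TEXT, family TEXT)"
)
conn.execute(
    "INSERT INTO person (given, family) VALUES (?, ?)", ("Ada", "Lovelace")
)

# A compatibility view glues the parts back together for old readers.
conn.execute(
    "CREATE VIEW person_v1 AS "
    "SELECT id, given || ' ' || family AS name FROM person"
)
old_style_name = conn.execute("SELECT name FROM person_v1").fetchone()[0]

# But an old-style writer, which knows only the single name field, fails:
try:
    conn.execute("INSERT INTO person (name) VALUES (?)", ("Grace Hopper",))
    write_shielded = True
except sqlite3.OperationalError:  # no such column: name
    write_shielded = False
```

So views encapsulate the query side nicely, but an insert/update that must now supply two fields is a genuine requirement pushed back onto the applications.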

Second thought - how about views? But they are a form of encapsulation of
the underlying tables, and the same applies.

Third thought - there is an amount of pain that is going to occur somewhere,
and 'you' pushed it 'our' way? That was mean of you. ;-> (See also Function
Point Analysis, if you ever can't sleep. Something that works (I have seen
it work), but makes "BDUF incorporating every little thing that anyone may
want to consider for the next ten years" look like prototyping... But it is
a way of measuring the amount of work needed to achieve a design, and (it is
a long time since I read the book, and I forget the details) if a set amount
of work has to be done, then I guess it gets done somewhere.)

I am going to ponder this one some more. A few train journeys, glasses of
wine, sleepless nights (not that I *intend* to lose sleep over it, but
those nights when I can't sleep anyway), then I will see if I can come up
with something to float at dm-discuss, and bring it back here.
And we are back to the need to come up with a generic, flexible and
very granular design very quickly, at the start of the
project, which
is why that great, monolithic, 120-entity Logical Enterprise Data
Model, with all its definitions and formats and
cross-references and
supporting stuff, is so valuable.
<snip>

(Paul)
There's also a question as to whether there's any reason
to have all those 120 entities in the same database or
whether the domain concepts would be better partitioned
into subject areas - I've seen both approaches but don't
know enough to say which approach is best in what
circumstance.
I keep them together, because otherwise the overlaps are impossible to
keep track of, but break them into subject areas. This being the Enterprise
Logical Data Model, it should reflect the underlying data structures of the
Enterprise, not of any individual application or database. The idea being
that any app or db can be replaced without affecting the ELDM. (That's the
theory, anyway.) It is also (I think, probably) a level of abstraction up
from any DB logical models. Including any warehouses you may have cluttering
the place up, so any attempts to use the ELDM as a db design without a close
review and (likely) expansion should be resisted. Firmly. To the last drop
of blood of the person making the suggestion. (As you can see, data
architects are a friendly, approachable bunch of people.)


(Paul)
I think this is about the right time to mention the
differences between Data-rich, Behaviour-rich and
Control-rich systems. Programmers will already be throwing
up their hands in horror at your suggested approach, and
maybe I can explain why, before anyone starts casting aspersions ;-)
DG: Yes, please.

(Paul)
In a data-rich system, the structure of the data drives the
problem, and though programmers would start with an object
model, starting with a data model would give perfectly
adequate results. For behaviour-rich systems, the behaviour
of the objects is significant, and if we are to have any
reasonable chance at building the system economically
and in a way that can respond readily to change, then
correct apportionment of behaviour to objects is paramount.
The data needed is determined by the behaviour needed.
In control-rich systems, behaviour is also important, but
we have the added complication that the behaviour of
objects changes significantly with time.
In reality, encapsulation of the data will save the day,
we hope, because the persistence layer can map
between the business object representation and the
underlying data representation, should they need to
differ. Yet awkward mappings can be performance
hogs.
(Phew - did I smooth all the ruffled feathers?)
DG: I hope this doesn't come across as a really stupid question... I think
it might, but please remember that I am a poor, confused, data dinosaur.
Ahem.

How do you encapsulate the data when you then tie it up into objects? Can
you split it out under those conditions? Or have I totally missed the point
of objects? (I am quite prepared to accept that I have, but the reference to
the object having behaviours seems to indicate that an object involves both
the data and the processes it performs or has performed on it, which is my
original understanding. Unless the object has no data. Help!)
A thought about the problem - maybe data is not agile
because it is the trail left (or the framework needed) by the
execution of business processes and policy - and the data
can only be
as agile as the things it records. Which is often pretty staid.
(Paul)
That's probably not relevant - I can envisage cases
where the same data is used by a 30-year-old core
business dinosaur and an ultra-modern agile added
value miracle (carefully displaying extreme bias ;-) ).
The 30-year-old core dinosaur being closely coupled while the young
whippersnapper goes through the APIs? (Biased? Me??) If only - can the young
upstart access the 30 year old file system? Or does it get an extract and
build its own version? (Replication! Duplication! *Bad* agile thingy...
<eg>)

Dagna

Paul Oldfield
2004-02-12 12:32:26 UTC
(Responding to Dagna)

First, sorry for the delay, was 400 miles from my machine
for a couple of days.

EVERYONE NEEDS TO KNOW EVERYTHING!
Post by Gaythorpe, Dagna
(Dagna)
... the dba may not see it as 'traditional' or 'agile', but as 'risk'.
(Paul)
That's fair enough, as long as all risks are evaluated
fairly, including the risk of being unable to respond to
change in a timely manner when it needs to happen.
DG: One way of reducing that risk would be to have a dba involved
in the project (in my youth, I made sure my grandmother was well
up to date in the latest egg-sucking techniques... <g>).
Agreed.
Post by Gaythorpe, Dagna
While all the dbas I have ever encountered have been obliging,
flexible, co-operative team players (was that a snigger at the
back?),
It was, ignore them, they don't know their own faults... ;-)
Post by Gaythorpe, Dagna
some tend to refuse everything or set up great fences to be
got over, to stop people messing up 'their' data. Or so I am
told. ;-) But if you can get them involved (and get over the two
standard responses - "you should have involved me sooner"
and "why are you dragging me in? I don't need to be involved
yet!"), then 'your' data is also 'theirs' - which helps a lot. If this
sounds like I am anti-dba - apologies, I am not, and I do
appreciate how much they have to do. But I have also suffered
from their attitudes.
Yes, if the whole of 'systems' is acting as one diversified
team, things will in theory progress a lot better. My take on
'Generalising Specialist' is that we should know at least the
basics of the other person's problems. If we get the DBA
involved early, we can get an early idea of which issues
need to be discussed with DBA, and which issues we don't
need to pass along for inspection and comment.
Post by Gaythorpe, Dagna
(Paul)
... one of the first solutions was to encapsulate the data and
put it and behaviour behind interfaces, so that the ripple effect
was stopped. That's now standard in programming, but not
yet in database....
DG: This one took some thought. And will do for some time, I think.
First thoughts - if it has been encapsulated, then surely that works
both ways? Answer - no, it almost certainly doesn't. If a data
change is needed, then encapsulation can protect those apps,
calls and so on that don't use the new structure (assuming that the
old one is still supported). If the underlying data remains the same,
and maybe a new index or whatever is added, or a table split into
master and detail, then the outside world needs to know nothing.
But if the format or contents of a field changes, or the field is no
longer used, or a new one appears, then there is an impact on
the outside world.
'We' have the view that the data is there primarily to support the
applications. I assume 'you' have the view that the data is there
primarily to support the business. Whereupon 'we' would say
"Yes, but the data supports the business by supporting the
applications that support the business". In theory, there should
never be any change that reduces the types of information held
in the database except where the business no longer needs
that information - and by extension no longer needs the applications
that use that information. Of course there will be granularity
problems... the bit of unwanted functionality may be hard to unpick
from wanted functionality.

If we consider the function of the encapsulating interface to be
mapping the information that an application needs or provides
to the data stored in the database, then in theory changing the
data structure, format, etc. should not cause changes that
ripple beyond the interface. However, there will be exceptions.
Suppose the database needs extra information when an
application is adding new information to the database - information
without which the database cannot store its new data 'correctly'.
Where this changes, the application now has a new requirement -
to supply this additional information.

I think this is an important way of looking at the situation; the
database is a stakeholder in the application development,
and conversely the application is a stakeholder in the database
development. Each supplies requirements to the other.
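A minimal sketch of the database supplying a requirement to the application (hypothetical schema, sqlite3 standing in for the real thing): a newly mandatory column makes the old insertion call fail until the application supplies the extra information.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer ("
    " id INTEGER PRIMARY KEY,"
    " name TEXT NOT NULL,"
    " country TEXT NOT NULL)"  # newly required by the database
)

# The old application call, which predates the new column, now fails:
try:
    conn.execute("INSERT INTO customer (name) VALUES (?)", ("Ada",))
    old_call_ok = True
except sqlite3.IntegrityError:  # NOT NULL constraint failed: customer.country
    old_call_ok = False

# The application has gained a requirement: supply the country too.
conn.execute(
    "INSERT INTO customer (name, country) VALUES (?, ?)", ("Ada", "UK")
)
rows = conn.execute("SELECT name, country FROM customer").fetchall()
```

No amount of encapsulation hides this one: the extra information has to come from somewhere, so the requirement ripples back to whoever calls the interface.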
Post by Gaythorpe, Dagna
Especially if it is something like the infamous 'name field split'
scenario, which may require a change to the way the apps
present the name to the database for insertion and update.
Indeed. Here the database is supplying requirements to the
application, probably as a result of requirements it received
from a different application.
Post by Gaythorpe, Dagna
Second thought - how about views? But they are a form of
encapsulation or the underlying tables, and the same applies.
Agreed.
Post by Gaythorpe, Dagna
Third thought - there is an amount of pain that is going to occur
somewhere, and 'you' pushed it 'our' way? That was mean of
you. ;->
Hey, do you want hints on solutions too? ;-) If applications
developers hadn't done encapsulation first, 'you' might be
teaching 'us'. The pain exists, the solutions seem to exist. If
we're all part of the same team, we're being helpful, not mean.
(Should've been a spin doctor? ;-) )
Post by Gaythorpe, Dagna
(See also Function Point Analysis, if you ever can't sleep.
Something that works (I have seen it work), but makes "BDUF
incorporating every little thing that anyone may want to consider
for the next ten years" look like prototyping... But it is a way of
measuring the amount of work needed to achieve a design,
and (it is a long time since I read the book, and I forget the
details) if a set amount of work has to be done, then I guess
it gets done somewhere.)
Right. Throw in another rule of thumb - if each bit of the work
gets done in the 'right place', then the system will be very
flexible. Unfortunately there are one or two things that throw this
nice picture out of kilter (If we could do all reporting from the object
layer rather than direct from the database, many things would be
much easier...).
Post by Gaythorpe, Dagna
I am going to ponder this one some more. A few train journeys,
glasses of wine, sleepless nights (not that I *intend* to lose
sleep over it, but those nights when I can't sleep anyway), then
I will see if I can come up with something to float at dm-discuss,
and bring it back here.
Sounds like a good idea.


<snip>
Post by Gaythorpe, Dagna
(Paul)
In a data-rich system, the structure of the data drives the
problem, and though programmers would start with an object
model, starting with a data model would give perfectly
adequate results. For behaviour-rich systems, the behaviour
of the objects is significant, and if we are to have any
reasonable chance at building the system economically
and in a way that can respond readily to change, then
correct apportionment of behaviour to objects is paramount -
the data needed is determined by the behaviour needed.
In control-rich systems, behaviour is also important, but
we have the added complication that the behaviour of
objects changes significantly with time.
In reality, encapsulation of the data will save the day,
we hope, because the persistence layer can map
between the business object representation and the
underlying data representation, should they need to
differ. Yet awkward mappings can be performance
hogs.
DG: I hope this doesn't come across as a really stupid question...
I think it might, but please remember that I am a poor, confused,
data dinosaur.
Ahem.
Agreed... Oops - make that "Okay" :-)
Post by Gaythorpe, Dagna
How do you encapsulate the data when you then tie it up into
objects? Can you split it out under those conditions? Or have
I totally missed the point of objects? (I am quite prepared to
accept that I have, but the reference to the object having
behaviours seems to indicate that an object involves both
the data and the processes it performs or has performed on it,
which is my original understanding. Unless the object has no
data. Help!)
It may be better thinking in terms of Information or Knowledge
rather than Data. Potentially, an object has both knowledge
and behaviour. Some objects have no knowledge at all,
they are totally stateless, they merely help other objects do
what they need to do with *their* knowledge. All objects have
behaviour, but in some cases this behaviour is so trivial
that it is nothing more than accepting and supplying
information.
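That distinction can be sketched in a few lines (class names invented for the example): one object carries both knowledge and behaviour, the other is a totally stateless helper that only works on knowledge it is handed.

```python
class Account:                      # knowledge + behaviour
    def __init__(self, balance):
        self.balance = balance      # the object's knowledge

    def deposit(self, amount):      # behaviour acting on that knowledge
        self.balance += amount

class InterestCalculator:           # behaviour only - totally stateless
    @staticmethod
    def interest(balance, rate):
        return balance * rate

acct = Account(100)
acct.deposit(50)
earned = InterestCalculator.interest(acct.balance, 0.5)
print(earned)  # 75.0
```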
Post by Gaythorpe, Dagna
From the applications programmer's point of view, this is
all that needs to happen - databases are unnecessary -
except for a couple of things that objects don't do very
well. The first is persistence. If the power gets turned off,
we don't want to lose all the information that the business needs.
The second is reporting. We could do reporting from the
object layer, but it tends to be much slower than doing it from
a relational database for many types of report.

As long as the object behaves in the way it is designed to behave,
and deals adequately with the type of information it has
responsibility for, it can do whatever it wants inside its
interface, such as storing the information as data in a database,
for example. It can change what it does behind the interface
with none of its clients being any the wiser (except where the
performance characteristics change). The responsibilities
that an object appears to support may be delegated to other
objects behind the interface - in this way the data can be
'split out', if I understand your question.
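A hedged sketch of that delegation, with invented class names: clients of Order see a single interface, while the knowledge is 'split out' among collaborators hidden behind it.

```python
class _CustomerDetails:             # hidden behind Order's interface
    def __init__(self, name):
        self.name = name

class _LineItems:                   # another hidden collaborator
    def __init__(self):
        self._items = []

    def add(self, item, price):
        self._items.append((item, price))

    def total(self):
        return sum(price for _, price in self._items)

class Order:
    """Clients see one interface; the data is split out internally."""
    def __init__(self, customer_name):
        self._customer = _CustomerDetails(customer_name)
        self._lines = _LineItems()

    def add_item(self, item, price):
        self._lines.add(item, price)   # responsibility delegated

    def total(self):
        return self._lines.total()

order = Order("Ada")
order.add_item("widget", 3.0)
order.add_item("gadget", 2.5)
t = order.total()
print(t)  # 5.5
```

Order could later store its lines in a database table instead of a list, with none of its clients being any the wiser.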
Post by Gaythorpe, Dagna
(Paul)
That's probably not relevant - I can envisage cases
where the same data is used by a 30-year-old core
business dinosaur and an ultra-modern agile added
value miracle (carefully displaying extreme bias ;-) ).
The 30-year-old core dinosaur being closely coupled while
the young whippersnapper goes through the APIs? (Biased?
Me??) If only - can the young upstart access the 30 year old
file system? Or does it get an extract and build its own version?
(Replication! Duplication! *Bad* agile thingy... <eg>)
In effect, any 'object' is an in-memory replica of a snippet
of the database (unless it contains no data that persists).
This causes problems, but we have transaction mechanisms
to cope with those problems, so the data doesn't get out of
step where that's important. Remember, data on its own is
pretty useless; to be useful there must be associated behaviour.
Sometimes this is mediated by the mind of a business person,
sometimes there's an application to do the 'donkey work'.
A modern application written by experienced programmers
would almost certainly have a 'data adaptor' or 'persistence'
layer that would translate between the 'ideal' format and the
'persistence' format, whatever that happened to be.
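A toy version of such a data adaptor, assuming (purely for illustration) JSON as the persistence format: the business object only ever sees the 'ideal' representation, and the adaptor translates in both directions.

```python
import json

class Employee:                     # the 'ideal' in-memory representation
    def __init__(self, name, grade):
        self.name = name
        self.grade = grade

class EmployeeAdaptor:              # translates between the two formats
    @staticmethod
    def to_persistence(emp):
        # persistence format: whatever the store happens to need
        return json.dumps({"n": emp.name, "g": emp.grade})

    @staticmethod
    def from_persistence(blob):
        d = json.loads(blob)
        return Employee(d["n"], d["g"])

stored = EmployeeAdaptor.to_persistence(Employee("Ada", 7))
restored = EmployeeAdaptor.from_persistence(stored)
print(restored.name, restored.grade)  # Ada 7
```

Swap JSON for a relational schema or a 30-year-old file format and only the adaptor changes.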

Personally, I wouldn't know how to go about accessing a
30 year old file system if it were on magnetic tape, unless
I were working in Cobol. I'm sure it could be done. I keep
thinking I'd transfer the tape to CD-ROM and chuck away
the tapes and tape drives - maybe that's naive?


Paul Oldfield

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
www.aptprocess.com

any opinions expressed herein are not necessarily those of
Mentors of Cally or the Appropriate Process Movement
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

For more information about AM, visit the Agile Modeling Home Page at www.agilemodeling.com