Hello & Welcome, everyone!
Thank you for joining us for today's webinar, hosted by the LiMSforum.
My name is Dinah Ramirez and I'm your Moderator for today's webinar.
Today's webinar, "A Guide for Laboratory Systems Management," is Part 4 of a webinar
series that's been presented by Joe Liscouski!
Today we'll be focusing on LIMS, ELNs, SDMS, IT & Education.
If you're joining us for the first time in the series, Joe is an experienced laboratory
automation/computing professional with over forty years' experience in the field, including:
- the design and development of automation systems
- LIMS, robotics, and data interchange standards
- consulting on the use of computing in lab work.
We're excited to have Joe with us here again for Part 4 in his series, so let's check in
with Joe and we'll get started!
Alright, Joe, thanks for joining us!
The floor is yours.
Thanks and welcome to the fourth session in this series.
We'll be continuing to look at the factors in making a choice of central database systems,
the added concerns multi-laboratory environments can bring, and the role that information technology
support groups have in this process.
As we've noted in these webinars, this companion book will provide useful background information
on the technologies, support, and other factors that can impact your ability to effectively
use these systems.
It should prove to be a useful reference as the webinars get into more technical details.
Those details will become more apparent in this session, and I will be making references
to sections of the book for more information.
I'll also provide additional reference material at the end of this webinar.
The previous sessions have introduced the major
systems in laboratory informatics and looked at how the return on investment can be measured
and evaluated.
In the last session, webinar 3, we began looking at the points that need to be considered when
choosing between a LIMS, ELN, or SDMS for a single lab, plus the documentation needed
to support those decisions.
At this point we are assuming that you've addressed the issue of what type of central
database system you are going to use as a target: LIMS, ELN, SDMS, or a combination
of them.
At this point we aren't looking at specific products, but rather product characteristics
that fit your workflow: large-scale repetitive testing or a laboratory work diary.
Our initial focus on the central database system as a starting point is based on a simple
premise: Multiple uncoordinated workflows from isolated workstations will leave you
with multiple sources of data and information that will at some point have to be integrated.
Without that integration, comprehensive lab-wide or even technique-wide data analysis and evaluation
will suffer significantly.
You are better off addressing this need as early as you can in your lab's life.
We looked at these points with regard to a single laboratory…
and now we're going to consider the points that might arise in a multi-laboratory environment,
differing implementation structures, and the role of IT support in this work.
Multiple lab situations can arise for a number of reasons: you may be working at a research
complex, or for a company or organization that has multiple campuses.
One laboratory may do testing in support of research and also develop test methods that
will be put to use in quality control labs.
The questions that we want to address are: can multiple labs benefit from common product usage,
and can they agree on one or more product characteristics that can lead them
to successfully choosing and implementing working systems?
The benefits of focusing on a single product or a small set of products include reduced
purchase costs.
Some informatics systems permit multiple independent datasets to run concurrently; you have the
ability to support several labs with a single license.
If you are buying software for several labs, you have better bargaining power.
The other benefits derive from support costs.
If the number of sites you are working with is large enough, the vendor may offer special
support considerations.
If your IT group is supporting your software, they will have an easier time supporting one
product set, particularly if it involves development work.
In addition, the learning curve for lab personnel will be smoother, and if people transfer between
departments, they will have access to systems that they are already familiar with.
Similarly, meeting the needs of regulatory compliance will be simplified.
All of this works if you don't have to compromise on meeting your laboratories' needs.
You don't want to have to sacrifice important features or force-fit LIMS functionality into
an ELN.
However, products that support both workflows would be useful, particularly if needs evolve
one way or the other.
When we are dealing with systems like these, there are additional considerations that have
to be taken into account.
Among them are system backups and archives.
Both are designed to address an important problem: data loss and retrieval.
Everything we've covered so far has been concerned with getting data and information.
It has to be protected from loss as well.
System crashes, floods, storms, electrical failures, and malware all have to be taken into account.
Protection against data loss includes backups and archives.
Backups provide short-term insurance against system failure and data loss.
They are copies, essentially a snapshot, of the entire system or major segments of it,
depending on your policies.
Different portions of the system may be backed up at different frequencies.
A backup can be used to restore all or part of a system.
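To make that idea a little more concrete, here is a minimal sketch, in Python, of how a policy with different backup frequencies for different portions of a system might be expressed. The directory names and intervals are made-up examples for illustration, not recommendations from any particular product.

    import datetime, pathlib, shutil

    # Hypothetical policy: each portion of the system gets its own backup interval.
    POLICY = [
        ("instrument_data", "/lab/data",    4),    # snapshot every 4 hours
        ("method_files",    "/lab/methods", 24),   # daily
        ("system_config",   "/lab/config",  168),  # weekly
    ]

    def backup_due(dest_dir, every_hours):
        # A backup is due if no snapshot exists yet, or the newest one is older than the interval.
        snapshots = sorted(dest_dir.glob("*")) if dest_dir.exists() else []
        if not snapshots:
            return True
        newest = datetime.datetime.fromtimestamp(snapshots[-1].stat().st_mtime)
        return datetime.datetime.now() - newest > datetime.timedelta(hours=every_hours)

    def run_backups(root="/backup"):
        stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M")
        for name, source, hours in POLICY:
            dest_dir = pathlib.Path(root) / name
            if backup_due(dest_dir, hours):
                shutil.copytree(source, dest_dir / stamp)   # a snapshot of that portion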
Archives are another form of backup; they include both actively used documents and historical
information, organized so that the contents are easily searched and retrieved, including
older versions of documents.
The archives have to be backed up as well.
This is a subject that could warrant a session of its own.
I mention it here because of the impact of these activities on IT support and how that
can be affected by the choices made in both product selection and use by multiple labs.
In case you are wondering, I have my work backed up or archived three different ways
including remote archives.
The backup is a current snapshot of the disk drives on the system.
The archives contain historical information including current and older versions of files.
A little paranoia goes a long way.
Why is this important to you?
The shift from paper-based systems to electronic media offers something you didn't have before:
protection against loss of laboratory work, plus the ability to easily re-organize it
and distribute it for use elsewhere.
It also raises a concern that you didn't have before: security against electronic theft
and malicious behavior, including ransomware and other problems.
We are beginning to define the role of an outside organization in laboratory work: that
of IT support.
One important characteristic of LIMS and ELNs designed for laboratory work is the ability
to connect instruments and data systems either directly to the database systems or through
an intermediate system like an SDMS.
This is an important selling point for these systems and a key component of improving productivity
and Return On Investment.
Instrument data can be automatically entered into the database and work lists can be sent
from the data-system to instrument/experiment workstations.
How significant these points are, and how they fit into your needs, is going to be a consideration
in the next few slides.
What we will be covering will have a direct bearing on multi-lab systems support and suitability.
How data and information are collected and moved around the laboratory, between data
sources and destinations, will have some bearing on how you prepare for centralized database
systems and where they are located.
The three primary modes of collection and communications are shown on the screen:
analog data capture with digital controls, serial communications, and Ethernet-compliant systems.
These can range from instrument-computer combinations that are one-to-one or many-instruments-to-one-computer,
to devices with built in communications protocols and connectors.
We'll look at the implications for centralized database systems in the next few slides.
The output of an analog device can go to a meter, chart-recorder, or most commonly today,
a computer system.
The instrument (data source) has to be close to the computer to avoid cabling problems
and to reduce noise.
Normally the computer will provide instrument control (including an autosampler) via digital
switches through a digital I/O card.
The computer provides the analysis of the data, with reporting and communications (usually
via Ethernet or WiFi) to an SDMS, LIMS/LIS, or ELN.
The distance between the instrument computer workstation and the instrument is dependent
on the nature of the control and data signals, acquisition speed, and the options for data
conversion over intermediate networked devices.
High-speed data collection and the use of hyphenated techniques argue for close physical
proximity to facilitate acquisition and control.
Low-speed devices such as chromatographs would permit greater separation between instrument
and computer, particularly if care was taken with noise rejection in cabling.
Since the computer provides data storage and communications buffering, the proximity of
the computer to the centralized data system isn't a problem as long as provision for
fault tolerance to the loss of a network connection is built in.
This becomes more of an issue as the database computer becomes more physically distant from
the instrument system due to the potential for delays and downtime.
The loss of a connection will impact information transfer in both directions; test results
in one direction, work lists in the other.
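One way to build in that fault tolerance is a simple store-and-forward approach: results are written locally first and forwarded when the connection is available. The Python sketch below illustrates the idea under assumptions of my own (a local spool directory and a LIMS that accepts results posted as JSON); neither comes from any specific product.

    import json, pathlib, urllib.request, urllib.error

    QUEUE_DIR = pathlib.Path("/lab/pending_results")   # hypothetical local spool directory

    def queue_result(result):
        # Write the result locally first; the network connection may not be available.
        QUEUE_DIR.mkdir(parents=True, exist_ok=True)
        name = f"{result['sample_id']}_{result['test']}.json"
        (QUEUE_DIR / name).write_text(json.dumps(result))

    def forward_queued(lims_url):
        # Push everything that is waiting; on failure, leave files in place and retry later.
        for path in sorted(QUEUE_DIR.glob("*.json")):
            req = urllib.request.Request(lims_url, data=path.read_bytes(),
                                         headers={"Content-Type": "application/json"})
            try:
                urllib.request.urlopen(req, timeout=10)
                path.unlink()                     # delivered, remove from the queue
            except (urllib.error.URLError, OSError):
                break                             # connection is down; try again later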
Some common laboratory devices are really packages of a measuring instrument and computer
control system.
The instrument provides the analog signal and the computer converts it into digital
form and provides serial or Ethernet communications protocols to a computer system.
pH meters and balances are among the devices that fall into this type of instrumentation.
These devices are usually designed to work in two possible modes: front panel controls
operated by a person, and programmable modes that depend on instructions from computers.
Front panel operations are controlled by the analyst's needs: take a measurement, transmit
it, etc.
The back-panel command structure is a simple command-and-reply sequence.
For example, a balance may be told to record a weight and send it back to a computer.
This requires an active connection; if there is a delay, or either the instrument or the computer
goes off-line, nothing happens.
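As an illustration of that command-and-reply pattern, here is a minimal Python sketch using the pyserial library. The port name and the command string are assumptions for the example; real devices each have their own command set, documented in the vendor's manual.

    import serial   # pyserial

    def read_weight(port="/dev/ttyUSB0", timeout_s=2.0):
        # Send a hypothetical 'report current weight' command and wait for the reply.
        with serial.Serial(port, baudrate=9600, timeout=timeout_s) as conn:
            conn.write(b"P\r\n")            # command string is an assumption, not a standard
            reply = conn.readline()         # e.g. b'+  12.3456 g\r\n'
            if not reply:                   # timed out: instrument off-line or cable problem
                return None
            return reply.decode(errors="replace").strip()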
LIMS, ELNs, and laboratory execution systems usually have facilities for connecting these devices
to a computer and controlling them.
A Laboratory Execution System for example,
may interact with a balance by instructing the analyst to place something on the balance
pan, press a button, and then the software records the weight in its data set, ready
to be used in the next step of the process.
LIMS and ELNs have similar functions.
If the connection between the software system and device drops out, nothing will happen
and the analyst will have to revert to manual front-panel operations, entering the information
into the database later.
When we talk about connecting instruments to centralized database systems, we aren't
talking about analog or digital interfaces.
What we are looking at is communications between computer systems, exchanging files or serial
data.
Even in those cases, serial communications is best done with the use of a local (to the
lab) intermediate computer.
Connecting instruments really means connecting the computer systems that are attached to
instruments and transferring files, or, if data exchange standards are in place, exchanging
messages.
The problem with serial data is twofold: first, there is the lack of an error-free communications
protocol with error detection and correction, and, second, considering the previous examples,
there is the possibility of delays in transmission resulting in problems carrying out lab tasks.
It is easier to let a local computer handle the instrument responses and package the resulting
information in a file transmitted over networks.
Fundamentally, time critical, fast response tasks should be serviced by local-to-the-lab
computers.
This is part of the planning needed in laying out lab networks.
When files are transmitted by an instrument workstation to a LIMS, ELN, LES or SDMS, they
are received and analyzed to extract the necessary information.
The information is then entered into the database system.
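A rough sketch of that receive-parse-store step is shown below, assuming (purely for illustration) a three-column CSV result file and a SQLite table standing in for the central database; a production system would use the vendor's import facilities or API instead.

    import csv, sqlite3

    def load_results(csv_path, db_path="lims_demo.db"):
        # Extract the fields of interest from the transmitted file and enter them in the database.
        # The (sample_id, analyte, result) layout is an assumption for this sketch.
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS results (sample_id TEXT, analyte TEXT, result REAL)")
        with open(csv_path, newline="") as fh:
            for row in csv.DictReader(fh):
                con.execute("INSERT INTO results VALUES (?, ?, ?)",
                            (row["sample_id"], row["analyte"], float(row["result"])))
        con.commit()
        con.close()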
The words "interfacing" and "communications" are not synonyms.
They represent different technologies, and we have to be careful how they are applied
to laboratory work.
It's important to understand the distinction between the "instrument" and the "data
system".
We don't normally interface instruments to LIMS/ELNs.
The interfacing is done through a computer system that communicates to the LIMS/ELNs.
We'll go into instrument interfacing and data systems in more detail in a later session;
it is very important to the design of your lab's technology.
The need for a distinction between "interfacing" and "communications" will begin to become
clear in the next few slides.
This is a simple situation: one lab, one system.
With this arrangement the lab has complete flexibility in connecting instrumentation
to the LIMS or ELN, as well as configuring the database to meet their needs.
If specialized programming is needed to support an instrument, the lab has the freedom to
do so.
All of the instrument types in the previous slides could be supported.
One consideration in particular is worth noting: commercial data systems have a library of
software available to support instrumentation and make instrument-computer connections easier.
What happens if your device isn't supported?
If it is a low cost device, it might be far easier and less costly to replace with one
that is.
If not, find something similar and modify the software.
Otherwise you have an add-on project.
Instrument support requirements should be part of the user requirements.
Software modifications are an IT issue.
Until the advent of high-speed networks, this was the typical laboratory system configuration.
IT would be responsible for hardware support, operating systems, and infrastructure, as
well as system backup.
The support for the lab application software might come from corporate IT, but might also
be a lab function or contracted to a 3rd party.
The multiple lab variation is just a repeat configuration for each lab.
With this arrangement the labs retain complete flexibility in connecting instrumentation
to the LIMS or ELN, as well as configuring the database to meet their needs.
If specialized programming is needed to support an instrument, the lab has the freedom to
do so.
If different labs have similar interconnection needs, the development work done for one lab
can be reused elsewhere.
This is one clear benefit of standardizing on product sets, as long as it doesn't compromise
the labs' work.
This situation could be replicated for a number of labs based on the same products, each lab
having full independence.
IT support would have to manage multiple computer systems dealing with support, updates, and
so on.
Each system would be individually backed up and the data system archived; as noted earlier,
this could be done automatically.
This puts a considerable burden on IT staff, which could be mitigated by automated backup
procedures.
Backup and archiving facilities would have to be incorporated into the user requirements
along with the policies for backup frequency and archiving of backups.
There would also have to be periodic testing of the backups and archives to make sure the
process is working.
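As one small example of what that periodic testing might look like, the Python sketch below spot-checks that files in the newest snapshot still match their live counterparts. The paths are placeholders, and a fuller test would restore to scratch space and exercise the application as well.

    import hashlib, pathlib

    def checksum(path):
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def verify_snapshot(live_root="/lab/data", snapshot_root="/backup/instrument_data/latest"):
        # Compare each backed-up file with the live copy; report anything that doesn't match.
        mismatches = []
        for snap_file in pathlib.Path(snapshot_root).rglob("*"):
            if snap_file.is_file():
                live_file = pathlib.Path(live_root) / snap_file.relative_to(snapshot_root)
                if not live_file.exists() or checksum(live_file) != checksum(snap_file):
                    mismatches.append(str(snap_file))
        return mismatches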
The next variation provides support benefits, but starts putting some complexity into the
system.
In this arrangement we have 3 independent labs sharing access to a common LIMS installation
(could be an ELN, or a multi-functional system) with each lab having its own copy of a database;
note: not all vendors support this configuration.
This could be considered as a variation of the software-as-a-service model, with the
software hosted on a private corporate server instead of the vendor's.
Another variation is the use of virtualization.
Virtualization is a software technology that allows one or more servers to host multiple
copies of software running independently.
There are some benefits and limitations that are imposed on the lab.
Note: the SDMS is configured to be local-to-the-lab in order to facilitate tasks that need fast
responses.
There are multiple configurations possible with these systems, and you have to evaluate your
needs to determine the one most appropriate for your work.
From the labs' standpoint they have access to fully functional centralized database systems:
LIMS or ELNs for example.
Laboratory data is concentrated in the SDMS with necessary information passed through
to the central database system.
The SDMS acts as a buffer or primary storage for laboratory instrument data.
The centralized data system would not be modified to meet individual labs' needs.
This is necessary to ensure that the data systems are easily upgraded and supported
without having to re-implement custom modifications.
Those modifications would be implemented in the SDMS which is supported by each laboratory's
independent configuration.
The major benefits are in the cost of the system when compared to purchasing and installing
multiple independent systems, the cost reduction for support and maintenance,
making it easier to add
additional labs, and the ease of providing backup and archiving.
Security and physical access control is provided by IT staff.
One popular technology application is moving software and data systems to the cloud.
From a network topology standpoint, this configuration and that on the previous slide are pretty
much the same.
The list of benefits from cloud implementations, particularly those that are vendor supported,
is considerable.
However, the issues do need attention.
One of the obvious points is that the server for the database system isn't anywhere you are
likely to be able to visit.
And that raises some issues of its own.
We are used to having the web take us virtually anywhere in the world with a few clicks, and
having the results pop up quickly.
We can enter information into systems without having any idea where they are, and expect
good response times.
Networks are fast enough that the main differences in response time we notice are between
areas with well-developed networks and those with less sophisticated systems.
Right now we are experiencing an example of the network's capabilities; I'm in Massachusetts,
our producer is in Michigan, and you are all over the place.
The distance between client and server systems can be measured in two ways: the time it takes
to send/receive data/information, and, the physical separation between them.
In most cases the physical separation doesn't matter, but the point-to-point physical span
can have significant impact on performance when life is less than ideal.
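If you want a rough feel for that time-based distance, a simple check like the Python sketch below times TCP connections to a server; the hostname is a placeholder, and real planning would also look at sustained transfer rates and uptime history.

    import socket, time

    def round_trip_ms(host, port=443, samples=5):
        # Time how long it takes to open a TCP connection, as a crude response-time measure.
        results = []
        for _ in range(samples):
            start = time.perf_counter()
            try:
                with socket.create_connection((host, port), timeout=5):
                    results.append((time.perf_counter() - start) * 1000.0)
            except OSError:
                results.append(None)        # failed attempt: counts as an interruption
        return results

    # Example with a placeholder hostname: round_trip_ms("lims.example.com")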
When we are in a planning process we have to work against potential problems, including
the security of your intellectual property.
When the servers move off-campus, distance concerns increase, and new ones are added:
security, downtime, delays, and legal issues.
Geographically distributed networks raise some issues.
The most obvious are delays and interruptions: those "404, file not found" errors,
or the "email delayed" messages.
They aren't frequent, but they happen.
Planning for them is essential.
Even situations like the one we are experiencing now can have problems that need to be planned
for, and I hope I didn't just jinx it.
For example we have rehearsal recordings that we can use if something prevents a live presentation.
The problem for your lab is: "what happens if your connection to your LIMS or ELN is
lost?"
What are the vendor's backup plans?
Do they have redundant systems that you can switch to if there is a problem?
How often are they synchronized?
Do you know how to access them?
Have you tested the process?
The further away you are from the server, the more room there is for problems to occur.
Among the causes for problems are power outages, storms like the large hurricanes experienced
in the US and elsewhere, and the potential for earthquakes.
You may feel that you are in a safe area, but how about your vendor's servers and
the space in between?
These aren't reasons to avoid using the technologies available, but areas where planning
has to be done, including testing, running the equivalent of fire drills to make sure
that plans actually work.
This is a subject where your IT group's experience, in conjunction with your vendors', comes into
play.
The time to make these plans is before implementation has begun, in fact, before you sign anything.
Another concern is security.
The networks we rely upon are global in extent.
So are the people who would like to have access to your data and information.
Are your corporate and remote systems protected against attacks that can include unauthorized
access, malware, denial of service, ransomware, and other unpleasant issues?
The typical SaaS vendor will say that their security is better than your corporate systems'.
This is another area where your IT group can work with your prospective vendors to ensure
functioning and safe systems.
The last point I want to make on this subject is legal issues.
When you are using software hosted on 3rd party platforms you become subject to a number
of legal issues that may originate in your location, country, or the country where your
servers are located.
For example, you may be working with a Software-as-a-Service vendor
for a database, and they host it on
a subcontractor's server farm that has locations in your country or another.
Each of these can contribute to legal concerns.
And you may not have done anything wrong.
Vendors may offer the option of hosting your database and software on either a private
server or a shared system.
If you are on a shared system, and someone else does something bad, the entire server
may be impounded, including your data.
The laws on this subject vary widely by country, and it is an evolving issue.
This is one where both IT and your legal department may need to be included before finalizing
the relationship with the vendor.
We began this presentation looking at the options for lab informatics, with all systems
residing within the lab's walls, and then considered the possibility of reducing costs by several
labs using the same products. That depends upon individual labs' needs analyses
converging on compatible solutions.
That being the case, we then looked at the ramifications of consolidating systems into
shared on-site servers and then having those databases hosted by off-site 3rd party vendors,
the Software-as-a-Service model.
We also looked at how instrument interfacing, data systems, and communication affected the
distribution of informatics inside the lab and working with remote systems.
In the previous webinar, we made a statement that even startup labs need to focus on the
centralized database as one of their initial priorities, determining if they need a LIMS
or an ELN, for example, to support their lab's information capture and analysis.
Vendor supported Software-as-a-Service models may be a good way to implement those tools
while minimizing the financial investment.
In all of this, we made frequent reference to the role of IT support.
Now it is time to begin looking at that subject in more detail.
We first raised the Information Technology Support issue in webinar 3, when we discussed
needs development.
Now as we start looking at implementation and support options, particularly when the
central database system might migrate from the lab to on-campus hosting and then to off-site,
possibly by 3rd party support, the need for coordination with IT becomes more important.
There are shared responsibilities between lab personnel and IT support for the successful
use of informatics systems in the lab.
As we've discussed, the lab user community is responsible for determining what is needed
and how it should function.
They have to be intimately involved in product specification and selection, as well as evaluating
the implementation options and the final system.
This stuff directly impacts their ability to work and the validity of the lab processes
being executed.
So what exactly should the role of IT be?
They aren't just people who support hardware and software; they need to be advisors on
software issues and, in particular, support.
That includes their ability to support users and their evaluation of vendors' ability
to support products and ensure that they are working properly.
This world is a lot more complex than office products, and people need to have the right
balance of skills to be successful; your laboratory depends on it.
We need to consider the role of corporate IT, and the possible addition of a LAB-IT
function.
In most organizations, corporate IT is responsible for hardware support, operating systems,
frequently used application software (office applications, for example), corporate
database systems, in many companies enterprise resource planning systems, and help desk support.
Enterprise resource planning includes a number of functions that run the businesses including
customer service, human resources, accounting, production, sales, and so on.
These are large, expensive systems that can encompass an entire business operation.
We'll get to how that can impact your lab later, probably in the next webinar.
While the capabilities provided by corporate IT are important to the lab's operation, fully
supporting laboratories is usually a bit outside their experience.
The items on the left are what companies typically see as a description of IT support.
Organizations that support manufacturing and scientific work may be able to justify personnel
that are specialized in those fields.
In some cases you may have people holding advanced scientific or engineering degrees
providing an IT support function.
The items on the right are more typical of laboratory systems.
The first bullet for example might be at odds with corporate policies of upgrading operating
systems; in a lab an upgrade could be a disaster.
There are a large number of lab applications found in any facility, which is a significant
contrast to most business operations.
Vendors can skimp on documentation; they are more interested in the technical aspects of
their products than in effective user documentation and support.
The impact of system problems can be serious.
Problems can easily snowball as work gets backed up.
And finally it is really hard to find people with the skills needed to support lab work.
It isn't just technical know-how, it's people skills, trouble-shooting, and problem
solving.
With a wee bit of pressure added to keep it interesting.
What we need to do is to develop professionals with a balance of skills to meet the increasingly
complex demands of technology management in scientific and laboratory work.
At one time that might have meant understanding LIMS or ELNs, or robotics, or instrument interfacing.
Now it is a complex and specialized mix of talents needed to bridge science and technology.
The work may expand to include modeling, simulation, and the ability to handle Big-Data computing
applications and analysis.
Beyond that, these people will need to be able to understand the needs of scientists
and develop the tools to meet their labs' needs.
This is where session five will begin.
And that is currently scheduled for April 26th.
During this session we've covered a lot of ground including the ability for multiple
labs to use common products, and the options for various levels of managing central databases
on the corporate and cloud levels.
Vendor hosted cloud systems could provide an easy entry point for small labs into informatics
technologies.
Care should be taken to ensure that they meet your requirements, and that they provide a
migration path to corporate on-premise hosting should that be desirable.
There will be a slide following this one in the handouts (PDF) you will have access to as part of
the webinar series.
It will have a list of additional references that will be useful for the lab's work.
That's the end of the formal presentation.
Are there any questions that people would like to address?
Thank you Joe!
I'd like to remind everyone that you can submit your
questions or comments using the chat box on the right-hand side of your screen.
So feel free to go ahead and do that.
While we wait for any questions or comments to come through, we do have a few questions
for Joe that came in that he'd like to address.
So, we'll go ahead and address those questions
and we'll give you a few moments if you do want to go ahead and submit any of your questions or comments.
I also want to mention that we will be providing a recording and the
slides to everyone after the webinar today.
It will be posted on the LiMSforum.
I know we did get a few of those questions and comments in the chatbox earlier.
Ok, so Joe here's one of the first few questions here.
You mentioned that the first major component of lab informatics should be the central database system.
How do you protect yourself from making a mistake?
Well, the major approach to doing that is making sure you've done a thorough job of specifying the system.
Really figure out what it is you need to get done, what you want the system to accomplish
and what model, whether it be the LIMS model or ELN model, fits what your workflow is.
You need to talk to people who are knowledgeable in the field
and look for systems that provide for flexibility.
There are a number of them and the number is increasing
that support both LIMS and ELN functions,
so it might be easier to transition between the two of them.
But the real one, the real kicker,
is making sure you do a really good analysis and evaluation of what it is you need to get accomplished
and how your lab wants to work.
Ok great! And here is another question that we received.
What if IT support people are pushing for a system based on their ERP system? Is this a good idea?
Usually not; we'll be going into more detail in webinar 5, where we look at the details of
IT support and start looking at some build-or-buy solutions.
IT people that are using enterprise resource planning systems
have invested a lot of effort, a lot of time, and a lot of money, and a lot of training
to learn how to use those systems.
And they may have sold them on the idea that these systems are gonna do everything
that the company needs to have done.
And then the lab people raise their hand...
and there's a lot of things in labs that they probably haven't considered.
They may try to build the system around an ERP
but that's usually not a good idea...some people have done it...
but the build-or-buy problem is a significant one.
It really needs a lot of thought and a lot of work.
Okay, interesting, great!
There's one last question that we received in advance here
and we'll see if any additional questions come through the chat box.
How realistic is it to want to connect instruments to a remote database system from the lab?
All right, you remember there is a distinction between the instrument which provides an analog output
and the computer system.
And in many cases when a salesman sells you an instrument,
they automatically sell you the computer as if they're the same thing.
You don't connect an instrument's analog output to remote computers...it simply doesn't work.
What you really do is communicate between computer systems: file sharing, for example, between
the computer that's attached to the instrument and a remote computer.
So often, when you talk to a salesman, you'll say, "I want to connect this instrument to
my LIMS or a remote ELN."
What you're really doing is making a computer-to-computer connection, not a connection to the
instrument itself, even though the computer is sold as part of the instrument package.
Okay great, thank you for answering those questions.
It does look like we have a couple questions that came in through the chat window as well.
This first one here... it looks like, it seems like the question is asking:
Can you provide criteria to consider for remote server hosting?
What do you mean by criteria?
Basically, when you're dealing with a remote server you're really dealing with
something akin to a time-sharing operation,
where you're sending data from an instrument to a remote database
and getting information back.
It's a lot like working with a web browser.
So among the things you want to be concerned about are response times, response speed,
how the data is being controlled, uptime on the servers, that kind of thing.
Basically ensure that when the server is there, when you need the server to be there, it'll be there working.
Not quite sure what it is you're trying to get at.
Okay yeah, if there's some more clarification that's needed,
just let us know in the chat box, but hopefully that answered your question regarding hosting.
Yep, if it didn't on the bottom of the slides is my email address. Just send me a note and we can talk that way.
Okay great, that's wonderful that you've provided your contact information Joe so that if anyone wants to
speak with you one-on-one then you can certainly contact Joe and he's a great resource for that.
And it does look like that does answer Nathan's question there.
Another question that came through is:
What is the difference between a traditional SDMS and a LIMS that has a module that emulates similar
functionality, but is not considered a true SDMS?
An example is provided...
Ex. A lab where it doesn't have an SDMS, but says they have similar functionality through a module.
Well, it's a lot like saying you've got something that looks just like a duck but it's not quite a duck.
It's kind of hard to answer that question without knowing what your situation is.
SDMS's have been around for a while.
Waters has a number of them.
As a matter of fact, Waters' SDMS was originally provided by a third party and became part of
their system.
The question really comes down to: what do you need to have done?
An SDMS can be looked at as being a very large filing cabinet that can handle a lot of different stuff:
Reports, documents, images, instrument data... that's sorted by project, by instrument type, sample types...
you can sort things a number of different ways.
So the easiest way to answer that question, is say:
These are the functions I need. These are the facilities that I need. This is how I want my lab to work.
Does the SDMS meet those needs?
And then, when you turn around to a LIMS vendor, for example, and look at what they provide...
...you can say fine, does your system meet all these needs? Or is there a lot of hacking or a lot of software
development that has to be done to get things working?
An SDMS provides a point of connection for a lot of instrumentation,
a place where a lot of data can get dumped that doesn't fit into a LIMS file structure.
And then the parts that do belong in the LIMS file structure can be extracted and moved forward.
So again, a lot depends on what problem you want to solve.
Okay, great. Thanks for answering that and of course if you do want to talk with Joe, one-on-one, his email
is available on the slides and we've provided it through our communication emails,
so you can dig there as well.
It does look like that did address their question Joe, so thank you.
Joe's email address again is joe.liscouski@gmail.com
It will be in the slides, so you can connect with him.
I don't see any other questions that are coming through at this point in time.
What we can do is we can start with our closing information and of course if any other questions
come through, feel free to go ahead and submit those. But it looks like we've covered all of those questions.
Joe is there anything else that you would want to cover before we go ahead and wrap up here today?
Just thank you for being here, and as I said, in the next session we're going to get into more
information about IT support; that tends to be a very important question.
We'll be beginning to look at the build or buy; and then future sessions will look at instrumentation,
instrument data systems, and moving down the line to sample preparation.
Okay great, thank you so much Joe! It doesn't look like there's anything else coming through,
so I believe that that will conclude our webinar session here today! Thank you so much!
And as a reminder we will be sending a follow-up to everyone registered with a link to the recording,
as well as the slides. You can locate Parts 1, 2, & 3 of this webinar series on the LiMSforum
and we'll include that in our follow-up email as well.
So we'll see you next time and thank you so much!