Re: [ptp-dev] Thinking about rsync-before-make hybrid local/remote projects

> If we have an IDE looking at remote files, is that violating
> the license terms?  We have to copy the file over to display it.  An
> IDE that will not let you look at any include files is pretty
> worthless.  But if it is ok to do that, why can't we copy them
> over?  Perhaps it is ok to keep them in memory but not ok to store
> them on disk.  That would just use more memory, not a big thing,
> since I doubt that the include files are bigger than (say) 100 MB.

I am pretty sure it's OK to keep a copy in memory. But licenses often say you have a license to use the compiler on one machine. Copying into the memory of another machine is sort of a grey area, I guess, but making a physical copy is probably a no-no. I'm not a lawyer, and obviously it depends on the software and licensing in question.

In RDT we currently do not cache any files, as we are using EFS without caching, so we don't have this problem. If you open files with the Remote Systems Explorer directly via the Remote Systems View, then it will cache the file locally. Obviously that caching can be problematic if licensing and/or security issues are involved, but at least there is a path where if you don't stray from it, you don't have a problem.
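To make the in-memory-versus-disk distinction concrete, here is a minimal sketch (not RDT's actual implementation) of a cache that holds remote file contents only in memory. The class and the fetch callable are illustrative names; the fetch function stands in for whatever remote transport (e.g. an SSH channel) is actually used. Nothing is ever written to disk, so no lasting copy of the file remains on the workstation.

```python
class InMemoryFileCache:
    """Cache remote file contents in memory only; never touch the disk."""

    def __init__(self, fetch):
        self._fetch = fetch   # callable: remote path -> bytes
        self._cache = {}      # remote path -> bytes, held in memory only

    def read(self, path):
        # Fetch over the network on first access; serve from memory after.
        if path not in self._cache:
            self._cache[path] = self._fetch(path)
        return self._cache[path]

    def invalidate(self, path=None):
        # Drop one entry, or everything, when the remote copy changes.
        if path is None:
            self._cache.clear()
        else:
            self._cache.pop(path, None)
```

The memory cost is bounded by the size of the headers actually opened, which (as noted below) is unlikely to be more than a few tens of megabytes.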

> The same with security.  If you allow workstations to access the
> files to edit them, what is going to keep the workstation from
> caching the files?  That is all we are doing.  Whether we store the
> cache on the disk or in memory is a less important detail.


Of course what you say is true. But we're not talking about some arbitrary software we don't know about. We are talking about software we are writing, and thus we can control what it does. Right now, with RDT, the files are only accessed over the network, which can be tunneled securely. Short of some Trojan reading the file's bits right out of memory, there isn't any lasting fingerprint of the file on the system, and that has some security advantages. Is it ironclad security? No, of course not. The very act of remote development introduces some level of insecurity. The communication channel could be cracked and listened to, people in a van outside might use Van Eck phreaking to read your screen, etc. But it's more secure than leaving copies of the files on someone's laptop, so that has some appeal to some people.

Anyway, I'm not saying that the reasons I gave will necessarily be reasons that you decide not to follow the path you are proposing. I'm just saying there are some things you need to be aware of and consider.

===========================
Chris Recoskie
Team Lead, IBM CDT and RDT
IBM Toronto



From: Ralph Johnson <johnson@xxxxxxxxxxx>
To: Parallel Tools Platform general developers <ptp-dev@xxxxxxxxxxx>
Cc: overbey2@xxxxxxxxxxxx
Date: 12/15/2009 09:38 AM
Subject: Re: [ptp-dev] Thinking about rsync-before-make hybrid local/remote projects
Sent by: ptp-dev-bounces@xxxxxxxxxxx

On Tue, Dec 15, 2009 at 7:39 AM, Chris Recoskie <recoskie@xxxxxxxxxx> wrote:
    Hi Jeff,



    > One option is to copy header files from the include paths on the remote
    > machine into the local project (or rsync them on every build).  So,
    > e.g., /MyEclipseProject/.remote-includes/mpi.h would be a local copy of
    > mpi.h from the remote machine.  Then the we would just add
    > "/MyEclipseProject/.remote-includes" to the list of include paths, and
    > features like Open Declaration and Search would more or less work
    > correctly.  (Maybe?)


    It's possible but I wouldn't recommend it, for a few reasons.

    - All the time it's going to take to copy those headers down.
    - Are you even permitted by the license terms of the OS or whatever library you are using to copy them to another machine?  You may only be licensed to use them in one place.
    - Are you permitted by your organization's security policy to copy said headers to another machine?  Imagine you work for some three letter agency in the government and are using some super secret code.  You probably can't.
    - Keeping the headers in sync with the canonical copy on the remote machine is going to be a pain.
    - Include statements on the local machine might use absolute paths.  That would mean putting the headers in the exact same place on the local machine so that #including with the same path works (which might kind of work if you are going from one UNIX-like OS to another, but definitely not when going from UNIX-like to Windows).  This also assumes you don't have a similarly named header file on the local machine that you don't want to stomp on.  You could maybe create a new preprocessor that supported some notion of mapping include paths to new paths, but that's a lot of work.
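The path-mapping idea in the last bullet above could be sketched roughly as follows. This is an illustration only, not part of any existing preprocessor; the function name and the mirror-directory layout (reusing the remote path's structure under a local root) are assumptions.

```python
from pathlib import PurePosixPath

def map_remote_include(remote_path, local_root):
    """Map an absolute remote header path into a local mirror tree,
    e.g. /usr/include/mpi.h -> <local_root>/usr/include/mpi.h."""
    p = PurePosixPath(remote_path)
    if not p.is_absolute():
        raise ValueError("expected an absolute remote path: %s" % remote_path)
    # Drop the leading "/" and re-root the path under the local mirror.
    return str(PurePosixPath(local_root).joinpath(*p.parts[1:]))
```

Preserving the remote directory structure under the mirror root avoids the name-collision problem: two remote headers with the same basename in different directories land in different local directories.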


The licensing issue seems to me like the biggest one.  Copying the headers only happens once; after that it is a matter of synching.  Synching is a pain, but it is a pain we know.  We will not store remote headers in the same location on the local machine; there will be a separate .remote-includes directory for them.  We are already planning to change all the preprocessors, and we don't think that is hard.  But I don't know what to do about the licensing issue.
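The copy-once-then-sync step might look something like the rsync invocation built below. The host name, directory paths, and flag choices are assumptions for illustration; PTP's actual sync mechanism may differ.

```python
def rsync_headers_command(remote_host, remote_include_dir, local_mirror_dir):
    """Build an rsync command that mirrors only the header files under a
    remote include directory into the project's local mirror directory."""
    return [
        "rsync",
        "-az",            # archive mode (preserve times/perms), compressed
        "--delete",       # drop local copies of headers removed remotely
        "--include=*/",   # descend into every directory...
        "--include=*.h",  # ...but transfer only header files
        "--exclude=*",    # skip everything else
        "%s:%s/" % (remote_host, remote_include_dir),
        local_mirror_dir + "/",
    ]
```

Because rsync only transfers files whose size or timestamp changed, the initial copy is the expensive run and subsequent syncs (e.g. before each build) should be cheap.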

If we have an IDE looking at remote files, is that violating the license terms?  We have to copy the file over to display it.  An IDE that will not let you look at any include files is pretty worthless.  But if it is ok to do that, why can't we copy them over?  Perhaps it is ok to keep them in memory but not ok to store them on disk.  That would just use more memory, not a big thing, since I doubt that the include files are bigger than (say) 100 MB.

The same with security.  If you allow workstations to access the files to edit them, what is going to keep the workstation from caching the files?  That is all we are doing.  Whether we store the cache on the disk or in memory is a less important detail.

-Ralph Johnson
_______________________________________________
ptp-dev mailing list
ptp-dev@xxxxxxxxxxx
https://dev.eclipse.org/mailman/listinfo/ptp-dev
