# Copyright (C) 2005, 2006 Canonical Ltd
#
# Authors:
#   Johan Rydberg <jrydberg@gnu.org>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA

"""Versioned text file storage api."""

from cStringIO import StringIO
import os
import urllib
from zlib import adler32

from bzrlib.lazy_import import lazy_import
lazy_import(globals(), """

from bzrlib import (
    errors,
    osutils,
    multiparent,
    tsort,
    revision,
    ui,
    )
from bzrlib.graph import DictParentsProvider, Graph, _StackedParentsProvider
from bzrlib.transport.memory import MemoryTransport
""")
from bzrlib.inter import InterObject
from bzrlib.registry import Registry
from bzrlib.symbol_versioning import *
from bzrlib.textmerge import TextMerge


adapter_registry = Registry()
adapter_registry.register_lazy(('knit-delta-gz', 'fulltext'), 'bzrlib.knit',
    'DeltaPlainToFullText')
adapter_registry.register_lazy(('knit-ft-gz', 'fulltext'), 'bzrlib.knit',
    'FTPlainToFullText')
adapter_registry.register_lazy(('knit-annotated-delta-gz', 'knit-delta-gz'),
    'bzrlib.knit', 'DeltaAnnotatedToUnannotated')
adapter_registry.register_lazy(('knit-annotated-delta-gz', 'fulltext'),
    'bzrlib.knit', 'DeltaAnnotatedToFullText')
adapter_registry.register_lazy(('knit-annotated-ft-gz', 'knit-ft-gz'),
    'bzrlib.knit', 'FTAnnotatedToUnannotated')
adapter_registry.register_lazy(('knit-annotated-ft-gz', 'fulltext'),
    'bzrlib.knit', 'FTAnnotatedToFullText')
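# Illustrative sketch (not part of the original module): adapters are keyed by
# (source storage kind, target storage kind), and register_lazy records only
# the module and class names, so bzrlib.knit is imported the first time an
# adapter is requested.  For example, to obtain the class that converts
# annotated knit fulltexts into plain fulltexts:
#
#   adapter_cls = adapter_registry.get(('knit-annotated-ft-gz', 'fulltext'))
#
# How the returned adapter class is instantiated depends on the adapter.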


class ContentFactory(object):
    """Abstract interface for insertion and retrieval from a VersionedFile.

    :ivar sha1: None, or the sha1 of the content fulltext.
    :ivar storage_kind: The native storage kind of this factory. One of
        'mpdiff', 'knit-annotated-ft', 'knit-annotated-delta', 'knit-ft',
        'knit-delta', 'fulltext', 'knit-annotated-ft-gz',
        'knit-annotated-delta-gz', 'knit-ft-gz', 'knit-delta-gz'.
    :ivar key: The key of this content. Each key is a tuple with a single
        string in it.
    :ivar parents: A tuple of parent keys for self.key. If the object has
        no parent information, None (as opposed to () for an empty list of
        parents).
    """

    def __init__(self):
        """Create a ContentFactory."""
        self.sha1 = None
        self.storage_kind = None
        self.key = None
        self.parents = None


class FulltextContentFactory(ContentFactory):
    """Static data content factory.

    This takes a fulltext when created and just returns that during
    get_bytes_as('fulltext').

    :ivar sha1: None, or the sha1 of the content fulltext.
    :ivar storage_kind: The native storage kind of this factory. Always
        'fulltext'.
    :ivar key: The key of this content. Each key is a tuple with a single
        string in it.
    :ivar parents: A tuple of parent keys for self.key. If the object has
        no parent information, None (as opposed to () for an empty list of
        parents).
    """

    def __init__(self, key, parents, sha1, text):
        """Create a ContentFactory."""
        self.sha1 = sha1
        self.storage_kind = 'fulltext'
        self.key = key
        self.parents = parents
        self._text = text

    def get_bytes_as(self, storage_kind):
        if storage_kind == self.storage_kind:
            return self._text
        raise errors.UnavailableRepresentation(self.key, storage_kind,
            self.storage_kind)
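# Illustrative sketch (not part of the original module): a FulltextContentFactory
# hands back exactly the text it was constructed with and refuses any other
# representation.  The key below is invented for the example.
#
#   record = FulltextContentFactory(('rev-1',), None, None, 'line one\n')
#   record.get_bytes_as('fulltext')    # -> 'line one\n'
#   record.get_bytes_as('knit-ft-gz')  # raises errors.UnavailableRepresentation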


class AbsentContentFactory(ContentFactory):
    """A placeholder content factory for unavailable texts.

    :ivar sha1: None.
    :ivar storage_kind: 'absent'.
    :ivar key: The key of this content. Each key is a tuple with a single
        string in it.
    :ivar parents: None.
    """

    def __init__(self, key):
        """Create a ContentFactory."""
        self.sha1 = None
        self.storage_kind = 'absent'
        self.key = key
        self.parents = None


class AdapterFactory(ContentFactory):
    """A content factory to adapt between key prefixes."""

    def __init__(self, key, parents, adapted):
        """Create an adapter factory instance."""
        self.key = key
        self.parents = parents
        self._adapted = adapted

    def __getattr__(self, attr):
        """Return a member from the adapted object."""
        if attr in ('key', 'parents'):
            return self.__dict__[attr]
        else:
            return getattr(self._adapted, attr)


def filter_absent(record_stream):
    """Adapt a record stream to remove absent records."""
    for record in record_stream:
        if record.storage_kind != 'absent':
            yield record
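# Illustrative sketch (not part of the original module): filter_absent is a
# generator, so it can be wrapped directly around get_record_stream() output
# when a caller only wants records whose text is actually available.  ``vf``
# and ``versions`` stand for any record-stream-capable store and its versions.
#
#   for record in filter_absent(vf.get_record_stream(versions, 'unordered', True)):
#       text = record.get_bytes_as('fulltext')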


class VersionedFile(object):
    """Versioned text file storage.

    A versioned file manages versions of line-based text files,
    keeping track of the originating version for each line.

    To clients the "lines" of the file are represented as a list of
    strings. These strings will typically have terminal newline
    characters, but this is not required.  In particular files commonly
    do not have a newline at the end of the file.

    Texts are identified by a version-id string.
    """

    @staticmethod
    def check_not_reserved_id(version_id):
        revision.check_not_reserved_id(version_id)

    def copy_to(self, name, transport):
        """Copy this versioned file to name on transport."""
        raise NotImplementedError(self.copy_to)

    def get_record_stream(self, versions, ordering, include_delta_closure):
        """Get a stream of records for versions.

        :param versions: The versions to include. Each version is a tuple
            (version,).
        :param ordering: Either 'unordered' or 'topological'. A topologically
            sorted stream has compression parents strictly before their
            children.
        :param include_delta_closure: If True then the closure across any
            compression parents will be included (in the data content of the
            stream, not in the emitted records). This guarantees that
            'fulltext' can be used successfully on every record.
        :return: An iterator of ContentFactory objects, each of which is only
            valid until the iterator is advanced.
        """
        raise NotImplementedError(self.get_record_stream)

    def has_version(self, version_id):
        """Returns whether version is present."""
        raise NotImplementedError(self.has_version)

    def insert_record_stream(self, stream):
        """Insert a record stream into this versioned file.

        :param stream: A stream of records to insert.
        :return: None
        :seealso VersionedFile.get_record_stream:
        """
        raise NotImplementedError
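    # Illustrative sketch (not part of the original module): consuming a record
    # stream.  Each ContentFactory is only valid until the iterator advances,
    # so bytes must be extracted inside the loop.  ``vf`` and ``versions`` are
    # assumed to be supplied by the caller.
    #
    #   for record in vf.get_record_stream(versions, 'topological', True):
    #       if record.storage_kind == 'absent':
    #           raise errors.RevisionNotPresent(record.key, vf)
    #       text = record.get_bytes_as('fulltext')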

    def add_lines(self, version_id, parents, lines, parent_texts=None,
        left_matching_blocks=None, nostore_sha=None, random_id=False,
        check_content=True):
        """Add a single text on top of the versioned file.

        Must raise RevisionAlreadyPresent if the new version is
        already present in file history.

        Must raise RevisionNotPresent if any of the given parents are
        not present in file history.

        :param lines: A list of lines. Each line must be a bytestring. And all
            of them except the last must be terminated with \n and contain no
            other \n's. The last line may either contain no \n's or a single
            terminating \n. If the lines list does not meet this constraint the
            add routine may error or may succeed - but you will be unable to
            read the data back accurately. (Checking the lines have been split
            correctly is expensive and extremely unlikely to catch bugs so it
            is not done at runtime unless check_content is True.)
        :param parent_texts: An optional dictionary containing the opaque
            representations of some or all of the parents of version_id to
            allow delta optimisations.  VERY IMPORTANT: the texts must be those
            returned by add_lines or data corruption can be caused.
        :param left_matching_blocks: a hint about which areas are common
            between the text and its left-hand-parent.  The format is
            the SequenceMatcher.get_matching_blocks format.
        :param nostore_sha: Raise ExistingContent and do not add the lines to
            the versioned file if the digest of the lines matches this.
        :param random_id: If True a random id has been selected rather than
            an id determined by some deterministic process such as a converter
            from a foreign VCS. When True the backend may choose not to check
            for uniqueness of the resulting key within the versioned file, so
            this should only be done when the result is expected to be unique
            anyway.
        :param check_content: If True, the lines supplied are verified to be
            bytestrings that are correctly formed lines.
        :return: The text sha1, the number of bytes in the text, and an opaque
                 representation of the inserted version which can be provided
                 back to future add_lines calls in the parent_texts dictionary.
        """
        self._check_write_ok()
        return self._add_lines(version_id, parents, lines, parent_texts,
            left_matching_blocks, nostore_sha, random_id, check_content)

    def _add_lines(self, version_id, parents, lines, parent_texts,
        left_matching_blocks, nostore_sha, random_id, check_content):
        """Helper to do the class specific add_lines."""
        raise NotImplementedError(self.add_lines)

    def add_lines_with_ghosts(self, version_id, parents, lines,
        parent_texts=None, nostore_sha=None, random_id=False,
        check_content=True, left_matching_blocks=None):
        """Add lines to the versioned file, allowing ghosts to be present.

        This takes the same parameters as add_lines and returns the same.
        """
        self._check_write_ok()
        return self._add_lines_with_ghosts(version_id, parents, lines,
            parent_texts, nostore_sha, random_id, check_content, left_matching_blocks)

    def _add_lines_with_ghosts(self, version_id, parents, lines, parent_texts,
        nostore_sha, random_id, check_content, left_matching_blocks):
        """Helper to do class specific add_lines_with_ghosts."""
        raise NotImplementedError(self.add_lines_with_ghosts)

    def check(self, progress_bar=None):
        """Check the versioned file for integrity."""
        raise NotImplementedError(self.check)

    def _check_lines_not_unicode(self, lines):
        """Check that lines being added to a versioned file are not unicode."""
        for line in lines:
            if line.__class__ is not str:
                raise errors.BzrBadParameterUnicode("lines")

    def _check_lines_are_lines(self, lines):
        """Check that the lines really are full lines without inline EOL."""
        for line in lines:
            if '\n' in line[:-1]:
                raise errors.BzrBadParameterContainsNewline("lines")

    def get_format_signature(self):
        """Get a text description of the data encoding in this file.

        :since: 0.90
        """
        raise NotImplementedError(self.get_format_signature)

    def make_mpdiffs(self, version_ids):
        """Create multiparent diffs for specified versions."""
        knit_versions = set()
        knit_versions.update(version_ids)
        parent_map = self.get_parent_map(version_ids)
        for version_id in version_ids:
            try:
                knit_versions.update(parent_map[version_id])
            except KeyError:
                raise errors.RevisionNotPresent(version_id, self)
        # We need to filter out ghosts, because we can't diff against them.
        knit_versions = set(self.get_parent_map(knit_versions).keys())
        lines = dict(zip(knit_versions,
            self._get_lf_split_line_list(knit_versions)))
        diffs = []
        for version_id in version_ids:
            target = lines[version_id]
            try:
                parents = [lines[p] for p in parent_map[version_id] if p in
                    knit_versions]
            except KeyError:
                # I don't know how this could ever trigger.
                # parent_map[version_id] was already triggered in the previous
                # for loop, and lines[p] has the 'if p in knit_versions' check,
                # so we again won't have a KeyError.
                raise errors.RevisionNotPresent(version_id, self)
            if len(parents) > 0:
                left_parent_blocks = self._extract_blocks(version_id,
                                                          parents[0], target)
            else:
                left_parent_blocks = None
            diffs.append(multiparent.MultiParent.from_lines(target, parents,
                         left_parent_blocks))
        return diffs

    def _extract_blocks(self, version_id, source, target):
        return None

    def add_mpdiffs(self, records):
        """Add mpdiffs to this VersionedFile.

        Records should be iterables of version, parents, expected_sha1,
        mpdiff. mpdiff should be a MultiParent instance.
        """
        # Does this need to call self._check_write_ok()? (IanC 20070919)
        vf_parents = {}
        mpvf = multiparent.MultiMemoryVersionedFile()
        versions = []
        for version, parent_ids, expected_sha1, mpdiff in records:
            versions.append(version)
            mpvf.add_diff(mpdiff, version, parent_ids)
        needed_parents = set()
        for version, parent_ids, expected_sha1, mpdiff in records:
            needed_parents.update(p for p in parent_ids
                                  if not mpvf.has_version(p))
        present_parents = set(self.get_parent_map(needed_parents).keys())
        for parent_id, lines in zip(present_parents,
                                 self._get_lf_split_line_list(present_parents)):
            mpvf.add_version(lines, parent_id, [])
        for (version, parent_ids, expected_sha1, mpdiff), lines in\
            zip(records, mpvf.get_line_list(versions)):
            if len(parent_ids) == 1:
                left_matching_blocks = list(mpdiff.get_matching_blocks(0,
                    mpvf.get_diff(parent_ids[0]).num_lines()))
            else:
                left_matching_blocks = None
            try:
                _, _, version_text = self.add_lines_with_ghosts(version,
                    parent_ids, lines, vf_parents,
                    left_matching_blocks=left_matching_blocks)
            except NotImplementedError:
                # The vf can't handle ghosts, so add lines normally, which will
                # (reasonably) fail if there are ghosts in the data.
                _, _, version_text = self.add_lines(version,
                    parent_ids, lines, vf_parents,
                    left_matching_blocks=left_matching_blocks)
            vf_parents[version] = version_text
        for (version, parent_ids, expected_sha1, mpdiff), sha1 in\
             zip(records, self.get_sha1s(versions)):
            if expected_sha1 != sha1:
                raise errors.VersionedFileInvalidChecksum(version)

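    # Illustrative sketch (not part of the original module): make_mpdiffs and
    # add_mpdiffs are designed to round-trip between two stores.  The version
    # ids and the second store are assumptions for the example; records are
    # (version, parents, expected_sha1, mpdiff) tuples as documented above.
    #
    #   diffs = source_vf.make_mpdiffs(['rev-2'])
    #   sha1 = source_vf.get_sha1s(['rev-2'])[0]
    #   target_vf.add_mpdiffs([('rev-2', ['rev-1'], sha1, diffs[0])])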
    def get_sha1s(self, version_ids):
        """Get the stored sha1 sums for the given revisions.

        :param version_ids: The names of the versions to lookup
        :return: a list of sha1s in order according to the version_ids
        """
        raise NotImplementedError(self.get_sha1s)

    def get_text(self, version_id):
        """Return version contents as a text string.

        Raises RevisionNotPresent if version is not present in
        file history.
        """
        return ''.join(self.get_lines(version_id))
    get_string = get_text

    def get_texts(self, version_ids):
        """Return the texts of listed versions as a list of strings.

        Raises RevisionNotPresent if version is not present in
        file history.
        """
        return [''.join(self.get_lines(v)) for v in version_ids]

    def get_lines(self, version_id):
        """Return version contents as a sequence of lines.

        Raises RevisionNotPresent if version is not present in
        file history.
        """
        raise NotImplementedError(self.get_lines)

    def _get_lf_split_line_list(self, version_ids):
        return [StringIO(t).readlines() for t in self.get_texts(version_ids)]

    def get_ancestry(self, version_ids, topo_sorted=True):
        """Return a list of all ancestors of given version(s). This
        will not include the null revision.

        This list will not be topologically sorted if topo_sorted=False is
        passed.

        Must raise RevisionNotPresent if any of the given versions are
        not present in file history."""
        if isinstance(version_ids, basestring):
            version_ids = [version_ids]
        raise NotImplementedError(self.get_ancestry)

    def get_ancestry_with_ghosts(self, version_ids):
        """Return a list of all ancestors of given version(s). This
        will not include the null revision.

        Must raise RevisionNotPresent if any of the given versions are
        not present in file history.

        Ghosts that are known about will be included in ancestry list,
        but are not explicitly marked.
        """
        raise NotImplementedError(self.get_ancestry_with_ghosts)

    def get_parent_map(self, version_ids):
        """Get a map of the parents of version_ids.

        :param version_ids: The version ids to look up parents for.
        :return: A mapping from version id to parents.
        """
        raise NotImplementedError(self.get_parent_map)

    def get_parents_with_ghosts(self, version_id):
        """Return version names for parents of version_id.

        Will raise RevisionNotPresent if version_id is not present
        in the history.

        Ghosts that are known about will be included in the parent list,
        but are not explicitly marked.
        """
        try:
            return list(self.get_parent_map([version_id])[version_id])
        except KeyError:
            raise errors.RevisionNotPresent(version_id, self)

    def annotate(self, version_id):
        """Return a list of (version-id, line) tuples for version_id.

        :raise RevisionNotPresent: If the given version is
        not present in file history.
        """
        raise NotImplementedError(self.annotate)

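    # Illustrative sketch (not part of the original module): the graph queries
    # above work on plain version-id strings at this layer.  The ids and
    # results shown are invented for the example.
    #
    #   vf.get_parent_map(['rev-2'])         # -> {'rev-2': ('rev-1',)}
    #   vf.get_parents_with_ghosts('rev-2')  # -> ['rev-1'], ghosts included
    #   vf.annotate('rev-2')                 # -> [('rev-1', 'old line\n'), ...]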
    def iter_lines_added_or_present_in_versions(self, version_ids=None,
                                                pb=None):
        """Iterate over the lines in the versioned file from version_ids.

        This may return lines from other versions. Each item the returned
        iterator yields is a tuple of a line and a text version that that line
        is present in (not introduced in).

        Ordering of results is in whatever order is most suitable for the
        underlying storage format.

        If a progress bar is supplied, it may be used to indicate progress.
        The caller is responsible for cleaning up progress bars (because this
        is an iterator).

        NOTES: Lines are normalised: they will all have \n terminators.
               Lines are returned in arbitrary order.

        :return: An iterator over (line, version_id).
        """
        raise NotImplementedError(self.iter_lines_added_or_present_in_versions)

    def plan_merge(self, ver_a, ver_b):
        """Return pseudo-annotation indicating how the two versions merge.

        This is computed between versions a and b and their common
        base.

        Weave lines present in none of them are skipped entirely.

        Legend:
        killed-base Dead in base revision
        killed-both Killed in each revision
        killed-a    Killed in a
        killed-b    Killed in b
        unchanged   Alive in both a and b (possibly created in both)
        new-a       Created in a
        new-b       Created in b
        ghost-a     Killed in a, unborn in b
        ghost-b     Killed in b, unborn in a
        irrelevant  Not in either revision
        """
        raise NotImplementedError(VersionedFile.plan_merge)

    def weave_merge(self, plan, a_marker=TextMerge.A_MARKER,
                    b_marker=TextMerge.B_MARKER):
        return PlanWeaveMerge(plan, a_marker, b_marker).merge_lines()[0]
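    # Illustrative sketch (not part of the original module): the plan consumed
    # by weave_merge is a sequence of (state, line) pairs using the legend in
    # plan_merge's docstring; weave_merge turns it into merged lines, emitting
    # conflict markers where both sides added text.  A hand-built plan is shown
    # here, whereas real plans come from plan_merge itself.
    #
    #   plan = [('unchanged', 'a\n'), ('new-a', 'b\n'), ('new-b', 'c\n')]
    #   merged_lines = vf.weave_merge(plan)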


class RecordingVersionedFilesDecorator(object):
    """A minimal versioned files that records calls made on it.

    Only enough methods have been added to support tests using it to date.

    :ivar calls: A list of the calls made; can be reset at any time by
        assigning [] to it.
    """

    def __init__(self, backing_vf):
        """Create a RecordingVersionedFilesDecorator decorating backing_vf.

        :param backing_vf: The versioned file to answer all methods.
        """
        self._backing_vf = backing_vf
        self.calls = []

    def get_record_stream(self, keys, sort_order, include_delta_closure):
        self.calls.append(("get_record_stream", keys, sort_order,
            include_delta_closure))
        return self._backing_vf.get_record_stream(keys, sort_order,
            include_delta_closure)
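# Illustrative sketch (not part of the original module): the decorator is meant
# for tests that want to assert which calls were made against a versioned files
# object.  ``backing`` is assumed to be a real implementation.
#
#   recorder = RecordingVersionedFilesDecorator(backing)
#   list(recorder.get_record_stream([('key',)], 'unordered', False))
#   recorder.calls  # -> [("get_record_stream", [('key',)], 'unordered', False)]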


class KeyMapper(object):
    """KeyMappers map between keys and underlying partitioned storage."""

    def map(self, key):
        """Map key to an underlying storage identifier.

        :param key: A key tuple e.g. ('file-id', 'revision-id').
        :return: An underlying storage identifier, specific to the partitioning
            mechanism.
        """
        raise NotImplementedError(self.map)

    def unmap(self, partition_id):
        """Map a partitioned storage id back to a key prefix.

        :param partition_id: The underlying partition id.
        :return: As much of a key (or prefix) as is derivable from the
            partition id.
        """
        raise NotImplementedError(self.unmap)


class ConstantMapper(KeyMapper):
    """A key mapper that maps to a constant result."""

    def __init__(self, result):
        """Create a ConstantMapper which will return result for all maps."""
        self._result = result

    def map(self, key):
        """See KeyMapper.map()."""
        return self._result


class URLEscapeMapper(KeyMapper):
    """Base class for use with transport backed storage.

    This provides a map and unmap wrapper that respectively url escape and
    unescape their outputs and inputs.
    """

    def map(self, key):
        """See KeyMapper.map()."""
        return urllib.quote(self._map(key))

    def unmap(self, partition_id):
        """See KeyMapper.unmap()."""
        return self._unmap(urllib.unquote(partition_id))


class PrefixMapper(URLEscapeMapper):
    """A key mapper that extracts the first component of a key.

    This mapper is for use with a transport based backend.
    """

    def _map(self, key):
        """See KeyMapper.map()."""
        return key[0]

    def _unmap(self, partition_id):
        """See KeyMapper.unmap()."""
        return (partition_id,)


class HashPrefixMapper(URLEscapeMapper):
    """A key mapper that combines the first component of a key with a hash.

    This mapper is for use with a transport based backend.
    """

    def _map(self, key):
        """See KeyMapper.map()."""
        prefix = self._escape(key[0])
        return "%02x/%s" % (adler32(prefix) & 0xff, prefix)

    def _escape(self, prefix):
        """No escaping needed here."""
        return prefix

    def _unmap(self, partition_id):
        """See KeyMapper.unmap()."""
        return (self._unescape(osutils.basename(partition_id)),)

    def _unescape(self, basename):
        """No unescaping needed for HashPrefixMapper."""
        return basename


class HashEscapedPrefixMapper(HashPrefixMapper):
    """Combines the escaped first component of a key with a hash.

    This mapper is for use with a transport based backend.
    """

    _safe = "abcdefghijklmnopqrstuvwxyz0123456789-_@,."

    def _escape(self, prefix):
        """Turn a key element into a filesystem safe string.

        This is similar to a plain urllib.quote, except
        it uses specific safe characters, so that it doesn't
        have to translate a lot of valid file ids.
        """
        # @ does not get escaped. This is because it is a valid
        # filesystem character we use all the time, and it looks
        # a lot better than seeing %40 all the time.
        r = [((c in self._safe) and c or ('%%%02x' % ord(c)))
             for c in prefix]
        return ''.join(r)

    def _unescape(self, basename):
        """Escaped names are easily unescaped by urlutils."""
        return urllib.unquote(basename)
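# Illustrative sketch (not part of the original module): the mappers translate
# key prefixes into transport-relative names.  The keys below are invented; the
# hash bucket produced by HashPrefixMapper is adler32-based, so its exact value
# is not spelled out.
#
#   ConstantMapper('inventory').map(('rev-1',))   # -> 'inventory'
#   PrefixMapper().map(('file-id', 'rev-1'))      # -> 'file-id'
#   PrefixMapper().unmap('file-id')               # -> ('file-id',)
#   HashPrefixMapper().map(('file-id', 'rev-1'))  # -> '<xx>/file-id'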


def make_versioned_files_factory(versioned_file_factory, mapper):
    """Create a ThunkedVersionedFiles factory.

    This will create a callable which when called creates a
    ThunkedVersionedFiles on a transport, using mapper to access individual
    versioned files, and versioned_file_factory to create each individual file.
    """
    def factory(transport):
        return ThunkedVersionedFiles(transport, versioned_file_factory, mapper,
            lambda:True)
    return factory
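# Illustrative sketch (not part of the original module): the factory pairs a
# per-file VersionedFile class with a KeyMapper and returns a callable that
# builds a ThunkedVersionedFiles over a transport.  WeaveFile is an assumption
# here, standing in for any compatible versioned_file_factory.
#
#   from bzrlib.weave import WeaveFile
#   vf_factory = make_versioned_files_factory(WeaveFile, PrefixMapper())
#   versioned_files = vf_factory(a_transport)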


class VersionedFiles(object):
    """Storage for many versioned files.

    This object allows a single keyspace for accessing the history graph and
    contents of named bytestrings.

    Currently no implementation allows the graph of different key prefixes to
    intersect, but the API does allow such implementations in the future.
    """

    def add_lines(self, key, parents, lines, parent_texts=None,
        left_matching_blocks=None, nostore_sha=None, random_id=False,
        check_content=True):
        """Add a text to the store.

        :param key: The key tuple of the text to add.
        :param parents: The parents key tuples of the text to add.
        :param lines: A list of lines. Each line must be a bytestring. And all
            of them except the last must be terminated with \n and contain no
            other \n's. The last line may either contain no \n's or a single
            terminating \n. If the lines list does not meet this constraint the
            add routine may error or may succeed - but you will be unable to
            read the data back accurately. (Checking the lines have been split
            correctly is expensive and extremely unlikely to catch bugs so it
            is not done at runtime unless check_content is True.)
        :param parent_texts: An optional dictionary containing the opaque
            representations of some or all of the parents of version_id to
            allow delta optimisations.  VERY IMPORTANT: the texts must be those
            returned by add_lines or data corruption can be caused.
        :param left_matching_blocks: a hint about which areas are common
            between the text and its left-hand-parent.  The format is
            the SequenceMatcher.get_matching_blocks format.
        :param nostore_sha: Raise ExistingContent and do not add the lines to
            the versioned file if the digest of the lines matches this.
        :param random_id: If True a random id has been selected rather than
            an id determined by some deterministic process such as a converter
            from a foreign VCS. When True the backend may choose not to check
            for uniqueness of the resulting key within the versioned file, so
            this should only be done when the result is expected to be unique
            anyway.
        :param check_content: If True, the lines supplied are verified to be
            bytestrings that are correctly formed lines.
        :return: The text sha1, the number of bytes in the text, and an opaque
                 representation of the inserted version which can be provided
                 back to future add_lines calls in the parent_texts dictionary.
        """
        raise NotImplementedError(self.add_lines)

    def add_mpdiffs(self, records):
        """Add mpdiffs to this VersionedFile.

        Records should be iterables of version, parents, expected_sha1,
        mpdiff. mpdiff should be a MultiParent instance.
        """
        vf_parents = {}
        mpvf = multiparent.MultiMemoryVersionedFile()
        versions = []
        for version, parent_ids, expected_sha1, mpdiff in records:
            versions.append(version)
            mpvf.add_diff(mpdiff, version, parent_ids)
        needed_parents = set()
        for version, parent_ids, expected_sha1, mpdiff in records:
            needed_parents.update(p for p in parent_ids
                                  if not mpvf.has_version(p))
        # It seems likely that adding all the present parents as fulltexts can
        # easily exhaust memory.
        present_parents = set(self.get_parent_map(needed_parents).keys())
        split_lines = osutils.split_lines
        for record in self.get_record_stream(present_parents, 'unordered',
            True):
            mpvf.add_version(split_lines(record.get_bytes_as('fulltext')),
                record.key, [])
        for (key, parent_keys, expected_sha1, mpdiff), lines in\
            zip(records, mpvf.get_line_list(versions)):
            if len(parent_keys) == 1:
                left_matching_blocks = list(mpdiff.get_matching_blocks(0,
                    mpvf.get_diff(parent_keys[0]).num_lines()))
            else:
                left_matching_blocks = None
            version_sha1, _, version_text = self.add_lines(key,
                parent_keys, lines, vf_parents,
                left_matching_blocks=left_matching_blocks)
            if version_sha1 != expected_sha1:
                raise errors.VersionedFileInvalidChecksum(key)
            vf_parents[key] = version_text

    def annotate(self, key):
        """Return a list of (version-key, line) tuples for the text of key.

        :raise RevisionNotPresent: If the key is not present.
        """
        raise NotImplementedError(self.annotate)

    def check(self, progress_bar=None):
        """Check this object for integrity."""
        raise NotImplementedError(self.check)

    @staticmethod
    def check_not_reserved_id(version_id):
        revision.check_not_reserved_id(version_id)

    def _check_lines_not_unicode(self, lines):
        """Check that lines being added to a versioned file are not unicode."""
        for line in lines:
            if line.__class__ is not str:
                raise errors.BzrBadParameterUnicode("lines")

    def _check_lines_are_lines(self, lines):
        """Check that the lines really are full lines without inline EOL."""
        for line in lines:
            if '\n' in line[:-1]:
                raise errors.BzrBadParameterContainsNewline("lines")

    def get_parent_map(self, keys):
        """Get a map of the parents of keys.

        :param keys: The keys to look up parents for.
        :return: A mapping from keys to parents. Absent keys are absent from
            the mapping.
        """
        raise NotImplementedError(self.get_parent_map)

    def get_record_stream(self, keys, ordering, include_delta_closure):
        """Get a stream of records for keys.

        :param keys: The keys to include.
        :param ordering: Either 'unordered' or 'topological'. A topologically
            sorted stream has compression parents strictly before their
            children.
        :param include_delta_closure: If True then the closure across any
            compression parents will be included (in the opaque data).
        :return: An iterator of ContentFactory objects, each of which is only
            valid until the iterator is advanced.
        """
        raise NotImplementedError(self.get_record_stream)

    def get_sha1s(self, keys):
        """Get the sha1's of the texts for the given keys.

        :param keys: The names of the keys to lookup
        :return: a list of sha1s matching keys.
        """
        raise NotImplementedError(self.get_sha1s)

    def insert_record_stream(self, stream):
        """Insert a record stream into this container.

        :param stream: A stream of records to insert.
        :return: None
        :seealso VersionedFile.get_record_stream:
        """
        raise NotImplementedError

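    # Illustrative sketch (not part of the original module): at this layer keys
    # are tuples, e.g. ('file-id', 'revision-id') for text stores, and copying
    # between two VersionedFiles stores is a stream-out, stream-in operation.
    # ``source`` and ``target`` are assumptions for the example.
    #
    #   keys = [('file-id', 'rev-1'), ('file-id', 'rev-2')]
    #   target.insert_record_stream(
    #       source.get_record_stream(keys, 'topological', True))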
    def iter_lines_added_or_present_in_keys(self, keys, pb=None):
        """Iterate over the lines in the versioned files from keys.

        This may return lines from other keys. Each item the returned
        iterator yields is a tuple of a line and a text version that that line
        is present in (not introduced in).

        Ordering of results is in whatever order is most suitable for the
        underlying storage format.

        If a progress bar is supplied, it may be used to indicate progress.
        The caller is responsible for cleaning up progress bars (because this
        is an iterator).

        NOTES:
         * Lines are normalised by the underlying store: they will all have \n
           terminators.
         * Lines are returned in arbitrary order.

        :return: An iterator over (line, key).
        """
        raise NotImplementedError(self.iter_lines_added_or_present_in_keys)

    def keys(self):
        """Return an iterable of the keys for all the contained texts."""
        raise NotImplementedError(self.keys)

    def make_mpdiffs(self, keys):
        """Create multiparent diffs for specified keys."""
        keys_order = tuple(keys)
        keys = frozenset(keys)
        knit_keys = set(keys)
        parent_map = self.get_parent_map(keys)
        for parent_keys in parent_map.itervalues():
            if parent_keys:
                knit_keys.update(parent_keys)
        missing_keys = keys - set(parent_map)
        if missing_keys:
            raise errors.RevisionNotPresent(missing_keys.pop(), self)
        # We need to filter out ghosts, because we can't diff against them.
        maybe_ghosts = knit_keys - keys
        ghosts = maybe_ghosts - set(self.get_parent_map(maybe_ghosts))
        knit_keys.difference_update(ghosts)
        lines = {}
        split_lines = osutils.split_lines
        for record in self.get_record_stream(knit_keys, 'topological', True):
            lines[record.key] = split_lines(record.get_bytes_as('fulltext'))
            # line_block_dict = {}
            # for parent, blocks in record.extract_line_blocks():
            #   line_blocks[parent] = blocks
            # line_blocks[record.key] = line_block_dict
        diffs = []
        for key in keys_order:
            target = lines[key]
            parents = parent_map[key] or []
            # Note that filtering knit_keys can lead to a parent difference
            # between the creation and the application of the mpdiff.
            parent_lines = [lines[p] for p in parents if p in knit_keys]
            if len(parent_lines) > 0:
                left_parent_blocks = self._extract_blocks(key, parent_lines[0],
                    target)
            else:
                left_parent_blocks = None
            diffs.append(multiparent.MultiParent.from_lines(target,
                parent_lines, left_parent_blocks))
        return diffs

    def _extract_blocks(self, version_id, source, target):
        return None


class ThunkedVersionedFiles(VersionedFiles):
    """Storage for many versioned files thunked onto a 'VersionedFile' class.

    This object allows a single keyspace for accessing the history graph and
    contents of named bytestrings.

    Currently no implementation allows the graph of different key prefixes to
    intersect, but the API does allow such implementations in the future.
    """

    def __init__(self, transport, file_factory, mapper, is_locked):
        """Create a ThunkedVersionedFiles."""
        self._transport = transport
        self._file_factory = file_factory
        self._mapper = mapper
        self._is_locked = is_locked

    def add_lines(self, key, parents, lines, parent_texts=None,
        left_matching_blocks=None, nostore_sha=None, random_id=False,
        check_content=True):
        """See VersionedFiles.add_lines()."""
        path = self._mapper.map(key)
        version_id = key[-1]
        parents = [parent[-1] for parent in parents]
        vf = self._get_vf(path)
        try:
            try:
                return vf.add_lines_with_ghosts(version_id, parents, lines,
                    parent_texts=parent_texts,
                    left_matching_blocks=left_matching_blocks,
                    nostore_sha=nostore_sha, random_id=random_id,
                    check_content=check_content)
            except NotImplementedError:
                return vf.add_lines(version_id, parents, lines,
                    parent_texts=parent_texts,
                    left_matching_blocks=left_matching_blocks,
                    nostore_sha=nostore_sha, random_id=random_id,
                    check_content=check_content)
        except errors.NoSuchFile:
            # parent directory may be missing, try again.
            self._transport.mkdir(osutils.dirname(path))
            try:
                return vf.add_lines_with_ghosts(version_id, parents, lines,
                    parent_texts=parent_texts,
                    left_matching_blocks=left_matching_blocks,
                    nostore_sha=nostore_sha, random_id=random_id,
                    check_content=check_content)
            except NotImplementedError:
                return vf.add_lines(version_id, parents, lines,
                    parent_texts=parent_texts,
                    left_matching_blocks=left_matching_blocks,
                    nostore_sha=nostore_sha, random_id=random_id,
                    check_content=check_content)

    def annotate(self, key):
952
        """Return a list of (version-key, line) tuples for the text of key.
953
954
        :raise RevisionNotPresent: If the key is not present.
955
        """
956
        prefix = key[:-1]
957
        path = self._mapper.map(prefix)
958
        vf = self._get_vf(path)
959
        origins = vf.annotate(key[-1])
960
        result = []
961
        for origin, line in origins:
962
            result.append((prefix + (origin,), line))
963
        return result
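
    # Illustrative sketch (editor's addition), with hypothetical keys: the
    # per-file origins come back with the prefix re-attached, e.g.
    #
    #   store.annotate(('file-id', 'rev-2'))
    #   => [(('file-id', 'rev-1'), 'old line\n'),
    #       (('file-id', 'rev-2'), 'new line\n')]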

    def check(self, progress_bar=None):
        """See VersionedFiles.check()."""
        for prefix, vf in self._iter_all_components():
            vf.check()

    def get_parent_map(self, keys):
        """Get a map of the parents of keys.

        :param keys: The keys to look up parents for.
        :return: A mapping from keys to parents. Absent keys are absent from
            the mapping.
        """
        prefixes = self._partition_keys(keys)
        result = {}
        for prefix, suffixes in prefixes.items():
            path = self._mapper.map(prefix)
            vf = self._get_vf(path)
            parent_map = vf.get_parent_map(suffixes)
            for key, parents in parent_map.items():
                result[prefix + (key,)] = tuple(
                    prefix + (parent,) for parent in parents)
        return result
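
    # Illustrative sketch (editor's addition): keys are partitioned by prefix,
    # each backing VersionedFile is queried with bare version ids, and the
    # prefix is re-attached to both keys and parents on the way out, e.g.
    #
    #   store.get_parent_map([('file-id', 'rev-2'), ('file-id', 'absent')])
    #   => {('file-id', 'rev-2'): (('file-id', 'rev-1'),)}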

    def _get_vf(self, path):
        if not self._is_locked():
            raise errors.ObjectNotLocked(self)
        return self._file_factory(path, self._transport, create=True,
            get_scope=lambda:None)

    def _partition_keys(self, keys):
        """Turn keys into a dict of prefix:suffix_list."""
        result = {}
        for key in keys:
            prefix_keys = result.setdefault(key[:-1], [])
            prefix_keys.append(key[-1])
        return result
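
    # Illustrative sketch (editor's addition), with hypothetical keys:
    #
    #   self._partition_keys([('a', 'r1'), ('a', 'r2'), ('b', 'r1')])
    #   => {('a',): ['r1', 'r2'], ('b',): ['r1']}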

    def _get_all_prefixes(self):
        # Identify all key prefixes.
        # XXX: A bit hacky, needs polish.
        if type(self._mapper) == ConstantMapper:
            paths = [self._mapper.map(())]
            prefixes = [()]
        else:
            relpaths = set()
            for quoted_relpath in self._transport.iter_files_recursive():
                path, ext = os.path.splitext(quoted_relpath)
                relpaths.add(path)
            paths = list(relpaths)
            prefixes = [self._mapper.unmap(path) for path in paths]
        return zip(paths, prefixes)

    def get_record_stream(self, keys, ordering, include_delta_closure):
        """See VersionedFiles.get_record_stream()."""
        # Ordering will be taken care of by each partitioned store; group keys
        # by partition.
        keys = sorted(keys)
        for prefix, suffixes, vf in self._iter_keys_vf(keys):
            suffixes = [(suffix,) for suffix in suffixes]
            for record in vf.get_record_stream(suffixes, ordering,
                include_delta_closure):
                if record.parents is not None:
                    record.parents = tuple(
                        prefix + parent for parent in record.parents)
                record.key = prefix + record.key
                yield record
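
    # Illustrative sketch (editor's addition): callers usually drain the
    # stream and skip absent records, for example to collect full texts:
    #
    #   texts = {}
    #   for record in store.get_record_stream(keys, 'unordered', True):
    #       if record.storage_kind == 'absent':
    #           continue
    #       texts[record.key] = record.get_bytes_as('fulltext')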

    def _iter_keys_vf(self, keys):
        prefixes = self._partition_keys(keys)
        for prefix, suffixes in prefixes.items():
            path = self._mapper.map(prefix)
            vf = self._get_vf(path)
            yield prefix, suffixes, vf

    def get_sha1s(self, keys):
        """See VersionedFiles.get_sha1s()."""
        sha1s = {}
        for prefix, suffixes, vf in self._iter_keys_vf(keys):
            vf_sha1s = vf.get_sha1s(suffixes)
            for suffix, sha1 in zip(suffixes, vf_sha1s):
                sha1s[prefix + (suffix,)] = sha1
        return [sha1s[key] for key in keys]

    def insert_record_stream(self, stream):
        """Insert a record stream into this container.

        :param stream: A stream of records to insert.
        :return: None
        :seealso: VersionedFile.get_record_stream
        """
        for record in stream:
            prefix = record.key[:-1]
            key = record.key[-1:]
            if record.parents is not None:
                parents = [parent[-1:] for parent in record.parents]
            else:
                parents = None
            thunk_record = AdapterFactory(key, parents, record)
            path = self._mapper.map(prefix)
            # Note that this parses the file many times; we can do better, but
            # as this only impacts weaves in terms of performance, it is
            # tolerable.
            vf = self._get_vf(path)
            vf.insert_record_stream([thunk_record])

    def iter_lines_added_or_present_in_keys(self, keys, pb=None):
        """Iterate over the lines in the versioned files from keys.

        This may return lines from other keys. Each item the returned
        iterator yields is a tuple of a line and a text version in which that
        line is present (not introduced).

        Ordering of results is in whatever order is most suitable for the
        underlying storage format.

        If a progress bar is supplied, it may be used to indicate progress.
        The caller is responsible for cleaning up progress bars (because this
        is an iterator).

        NOTES:
         * Lines are normalised by the underlying store: they will all have \n
           terminators.
         * Lines are returned in arbitrary order.

        :return: An iterator over (line, key).
        """
        for prefix, suffixes, vf in self._iter_keys_vf(keys):
            for line, version in vf.iter_lines_added_or_present_in_versions(suffixes):
                yield line, prefix + (version,)

    def _iter_all_components(self):
        for path, prefix in self._get_all_prefixes():
            yield prefix, self._get_vf(path)

    def keys(self):
        """See VersionedFiles.keys()."""
        result = set()
        for prefix, vf in self._iter_all_components():
            for suffix in vf.versions():
                result.add(prefix + (suffix,))
        return result


class _PlanMergeVersionedFile(VersionedFiles):
    """A VersionedFile for uncommitted and committed texts.

    It is intended to allow merges to be planned with working tree texts.
    It implements only the small part of the VersionedFiles interface used by
    PlanMerge.  It falls back to multiple versionedfiles for data not stored in
    _PlanMergeVersionedFile itself.

    :ivar fallback_versionedfiles: a list of VersionedFiles objects that can
        be queried for missing texts.
    """

    def __init__(self, file_id):
        """Create a _PlanMergeVersionedFile.

        :param file_id: Used with _PlanMerge code which is not yet fully
            tuple-keyspace aware.
        """
        self._file_id = file_id
        # Fallback locations.
        self.fallback_versionedfiles = []
        # Parents for locally held keys.
        self._parents = {}
        # Line data for locally held keys.
        self._lines = {}
        # Key lookup providers.
        self._providers = [DictParentsProvider(self._parents)]
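
    # Illustrative sketch (editor's addition): a merge planner seeds this
    # object with working-tree texts under reserved revision ids (which end
    # with ':') and lets committed texts come from the fallbacks.  The names
    # repo_texts, base_keys, this_lines and other_lines are placeholders:
    #
    #   planner_vf = _PlanMergeVersionedFile('file-id')
    #   planner_vf.fallback_versionedfiles.append(repo_texts)
    #   planner_vf.add_lines(('file-id', 'this:'), base_keys, this_lines)
    #   planner_vf.add_lines(('file-id', 'other:'), base_keys, other_lines)
    #   plan = list(planner_vf.plan_merge('this:', 'other:'))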

    def plan_merge(self, ver_a, ver_b, base=None):
        """See VersionedFile.plan_merge"""
        from bzrlib.merge import _PlanMerge
        if base is None:
            return _PlanMerge(ver_a, ver_b, self, (self._file_id,)).plan_merge()
        old_plan = list(_PlanMerge(ver_a, base, self, (self._file_id,)).plan_merge())
        new_plan = list(_PlanMerge(ver_a, ver_b, self, (self._file_id,)).plan_merge())
        return _PlanMerge._subtract_plans(old_plan, new_plan)

    def plan_lca_merge(self, ver_a, ver_b, base=None):
        """Plan an LCA merge of ver_a and ver_b, optionally subtracting base."""
        from bzrlib.merge import _PlanLCAMerge
        graph = Graph(self)
        new_plan = _PlanLCAMerge(ver_a, ver_b, self, (self._file_id,), graph).plan_merge()
        if base is None:
            return new_plan
        old_plan = _PlanLCAMerge(ver_a, base, self, (self._file_id,), graph).plan_merge()
        return _PlanLCAMerge._subtract_plans(list(old_plan), list(new_plan))

    def add_lines(self, key, parents, lines):
        """See VersionedFiles.add_lines

        Lines are added locally, not to fallback versionedfiles.  Also, ghosts
        are permitted.  Only reserved ids are permitted.
        """
        if type(key) != tuple:
            raise TypeError('key must be a tuple, not %r' % (key,))
        if not revision.is_reserved_id(key[-1]):
            raise ValueError('Only reserved ids may be used')
        if parents is None:
            raise ValueError('Parents may not be None')
        if lines is None:
            raise ValueError('Lines may not be None')
        self._parents[key] = tuple(parents)
        self._lines[key] = lines

    def get_record_stream(self, keys, ordering, include_delta_closure):
        """See VersionedFiles.get_record_stream()."""
        pending = set(keys)
        for key in keys:
            if key in self._lines:
                lines = self._lines[key]
                parents = self._parents[key]
                pending.remove(key)
                yield FulltextContentFactory(key, parents, None,
                    ''.join(lines))
        for versionedfile in self.fallback_versionedfiles:
            for record in versionedfile.get_record_stream(
                pending, 'unordered', True):
                if record.storage_kind == 'absent':
                    continue
                else:
                    pending.remove(record.key)
                    yield record
            if not pending:
                return
        # Report absent entries.
        for key in pending:
            yield AbsentContentFactory(key)

    def get_parent_map(self, keys):
        """See VersionedFiles.get_parent_map"""
        # We create a new provider because a fallback may have been added.
        # If we make fallbacks private we can update a stack list and avoid
        # object creation thrashing.
        self._providers = self._providers[:1] + self.fallback_versionedfiles
        result = _StackedParentsProvider(self._providers).get_parent_map(keys)
        for key, parents in result.iteritems():
            if parents == ():
                result[key] = (revision.NULL_REVISION,)
        return result


class PlanWeaveMerge(TextMerge):
    """Weave merge that takes a plan as its input.

    This exists so that VersionedFile.plan_merge is implementable.
    Most callers will want to use WeaveMerge instead.
    """

    def __init__(self, plan, a_marker=TextMerge.A_MARKER,
                 b_marker=TextMerge.B_MARKER):
        TextMerge.__init__(self, a_marker, b_marker)
        self.plan = plan

    def _merge_struct(self):
        lines_a = []
        lines_b = []
        ch_a = ch_b = False

        def outstanding_struct():
            if not lines_a and not lines_b:
                return
            elif ch_a and not ch_b:
                # one-sided change:
                yield (lines_a,)
            elif ch_b and not ch_a:
                yield (lines_b,)
            elif lines_a == lines_b:
                yield (lines_a,)
            else:
                yield (lines_a, lines_b)

        # We previously considered either 'unchanged' or 'killed-both' lines
        # to be possible places to resynchronize.  However, assuming agreement
        # on killed-both lines may be too aggressive. -- mbp 20060324
        for state, line in self.plan:
            if state == 'unchanged':
                # Resynchronize and flush any queued conflicting changes.
                for struct in outstanding_struct():
                    yield struct
                lines_a = []
                lines_b = []
                ch_a = ch_b = False

            if state == 'unchanged':
                if line:
                    yield ([line],)
            elif state == 'killed-a':
                ch_a = True
                lines_b.append(line)
            elif state == 'killed-b':
                ch_b = True
                lines_a.append(line)
            elif state == 'new-a':
                ch_a = True
                lines_a.append(line)
            elif state == 'new-b':
                ch_b = True
                lines_b.append(line)
            elif state == 'conflicted-a':
                ch_b = ch_a = True
                lines_a.append(line)
            elif state == 'conflicted-b':
                ch_b = ch_a = True
                lines_b.append(line)
            else:
                if state not in ('irrelevant', 'ghost-a', 'ghost-b',
                        'killed-base', 'killed-both'):
                    raise AssertionError(state)
        for struct in outstanding_struct():
            yield struct
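
    # Illustrative sketch (editor's addition): a plan is a sequence of
    # (state, line) pairs.  Unchanged lines flush any queued changes and are
    # emitted as one-sided structs; queued new-a/new-b runs that differ come
    # out as two-sided (conflicting) structs:
    #
    #   plan = [('unchanged', 'a\n'), ('new-a', 'b\n'), ('new-b', 'c\n'),
    #           ('unchanged', 'd\n')]
    #   list(PlanWeaveMerge(plan)._merge_struct())
    #   => [(['a\n'],), (['b\n'], ['c\n']), (['d\n'],)]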


class WeaveMerge(PlanWeaveMerge):
    """Weave merge that takes a VersionedFile and two versions as its input."""

    def __init__(self, versionedfile, ver_a, ver_b,
        a_marker=PlanWeaveMerge.A_MARKER, b_marker=PlanWeaveMerge.B_MARKER):
        plan = versionedfile.plan_merge(ver_a, ver_b)
        PlanWeaveMerge.__init__(self, plan, a_marker, b_marker)