# Copyright (C) 2005, 2006, 2007, 2008 Canonical Ltd
#
# Authors:
#   Johan Rydberg <jrydberg@gnu.org>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA  02111-1307  USA


"""Versioned text file storage api."""

from copy import copy
from cStringIO import StringIO
import os
from zlib import adler32

from bzrlib.lazy_import import lazy_import
lazy_import(globals(), """
import urllib

from bzrlib import (
    errors,
    index,
    knit,
    osutils,
    multiparent,
    tsort,
    revision,
    ui,
    )
from bzrlib.graph import DictParentsProvider, Graph, _StackedParentsProvider
from bzrlib.transport.memory import MemoryTransport
""")
from bzrlib.inter import InterObject
from bzrlib.registry import Registry
from bzrlib.symbol_versioning import *
from bzrlib.textmerge import TextMerge


adapter_registry = Registry()
adapter_registry.register_lazy(('knit-delta-gz', 'fulltext'), 'bzrlib.knit',
    'DeltaPlainToFullText')
adapter_registry.register_lazy(('knit-ft-gz', 'fulltext'), 'bzrlib.knit',
    'FTPlainToFullText')
adapter_registry.register_lazy(('knit-annotated-delta-gz', 'knit-delta-gz'),
    'bzrlib.knit', 'DeltaAnnotatedToUnannotated')
adapter_registry.register_lazy(('knit-annotated-delta-gz', 'fulltext'),
    'bzrlib.knit', 'DeltaAnnotatedToFullText')
adapter_registry.register_lazy(('knit-annotated-ft-gz', 'knit-ft-gz'),
    'bzrlib.knit', 'FTAnnotatedToUnannotated')
adapter_registry.register_lazy(('knit-annotated-ft-gz', 'fulltext'),
    'bzrlib.knit', 'FTAnnotatedToFullText')
# adapter_registry.register_lazy(('knit-annotated-ft-gz', 'chunked'),
#     'bzrlib.knit', 'FTAnnotatedToChunked')
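
# Illustrative sketch (not part of the original module): the registry maps a
# (source storage_kind, target storage_kind) pair to an adapter class in
# bzrlib.knit.  Looking one up and applying it to a record might look roughly
# like this; the adapter constructor argument and the get_bytes() call are
# assumptions about the adapter API, not taken from this file.
#
#   adapter_cls = adapter_registry.get(('knit-annotated-ft-gz', 'fulltext'))
#   adapter = adapter_cls(None)
#   fulltext = adapter.get_bytes(record)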


class ContentFactory(object):
    """Abstract interface for insertion and retrieval from a VersionedFile.

    :ivar sha1: None, or the sha1 of the content fulltext.
    :ivar storage_kind: The native storage kind of this factory. One of
        'mpdiff', 'knit-annotated-ft', 'knit-annotated-delta', 'knit-ft',
        'knit-delta', 'fulltext', 'knit-annotated-ft-gz',
        'knit-annotated-delta-gz', 'knit-ft-gz', 'knit-delta-gz'.
    :ivar key: The key of this content. Each key is a tuple with a single
        string in it.
    :ivar parents: A tuple of parent keys for self.key. If the object has
        no parent information, None (as opposed to () for an empty list of
        parents).
    """

    def __init__(self):
        """Create a ContentFactory."""
        self.sha1 = None
        self.storage_kind = None
        self.key = None
        self.parents = None


class ChunkedContentFactory(ContentFactory):
    """Static data content factory.

    This takes a 'chunked' list of strings. The only requirement on 'chunked' is
    that ''.join(chunked) becomes a valid fulltext. A tuple of a single string
    satisfies this, as does a list of lines.

    :ivar sha1: None, or the sha1 of the content fulltext.
    :ivar storage_kind: The native storage kind of this factory. Always
        'chunked'.
    :ivar key: The key of this content. Each key is a tuple with a single
        string in it.
    :ivar parents: A tuple of parent keys for self.key. If the object has
        no parent information, None (as opposed to () for an empty list of
        parents).
    """

    def __init__(self, key, parents, sha1, chunks):
        """Create a ContentFactory."""
        self.sha1 = sha1
        self.storage_kind = 'chunked'
        self.key = key
        self.parents = parents
        self._chunks = chunks

    def get_bytes_as(self, storage_kind):
        if storage_kind == 'chunked':
            return self._chunks
        elif storage_kind == 'fulltext':
            return ''.join(self._chunks)
        raise errors.UnavailableRepresentation(self.key, storage_kind,
            self.storage_kind)
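
# Illustrative sketch (not part of the original module): constructing a
# chunked record by hand and reading it back in either representation.  The
# keys and chunk values are made up for the example.
#
#   factory = ChunkedContentFactory(('rev-2',), (('rev-1',),), None,
#       ['first line\n', 'second line\n'])
#   factory.get_bytes_as('fulltext')   # -> 'first line\nsecond line\n'
#   factory.get_bytes_as('chunked')    # -> ['first line\n', 'second line\n']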


class FulltextContentFactory(ContentFactory):
    """Static data content factory.

    This takes a fulltext when created and just returns that during
    get_bytes_as('fulltext').

    :ivar sha1: None, or the sha1 of the content fulltext.
    :ivar storage_kind: The native storage kind of this factory. Always
        'fulltext'.
    :ivar key: The key of this content. Each key is a tuple with a single
        string in it.
    :ivar parents: A tuple of parent keys for self.key. If the object has
        no parent information, None (as opposed to () for an empty list of
        parents).
    """

    def __init__(self, key, parents, sha1, text):
        """Create a ContentFactory."""
        self.sha1 = sha1
        self.storage_kind = 'fulltext'
        self.key = key
        self.parents = parents
        self._text = text

    def get_bytes_as(self, storage_kind):
        if storage_kind == self.storage_kind:
            return self._text
        elif storage_kind == 'chunked':
            return [self._text]
        raise errors.UnavailableRepresentation(self.key, storage_kind,
            self.storage_kind)


class AbsentContentFactory(ContentFactory):
    """A placeholder content factory for unavailable texts.

    :ivar sha1: None.
    :ivar storage_kind: 'absent'.
    :ivar key: The key of this content. Each key is a tuple with a single
        string in it.
    :ivar parents: None.
    """

    def __init__(self, key):
        """Create a ContentFactory."""
        self.sha1 = None
        self.storage_kind = 'absent'
        self.key = key
        self.parents = None


class AdapterFactory(ContentFactory):
    """A content factory to adapt between key prefixes."""

    def __init__(self, key, parents, adapted):
        """Create an adapter factory instance."""
        self.key = key
        self.parents = parents
        self._adapted = adapted

    def __getattr__(self, attr):
        """Return a member from the adapted object."""
        if attr in ('key', 'parents'):
            return self.__dict__[attr]
        else:
            return getattr(self._adapted, attr)


def filter_absent(record_stream):
    """Adapt a record stream to remove absent records."""
    for record in record_stream:
        if record.storage_kind != 'absent':
            yield record
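
# Illustrative sketch (not part of the original module): dropping unavailable
# texts from a record stream before use.  'vf' stands for any concrete
# VersionedFile implementation and 'versions' for an iterable of version
# tuples; both are assumptions made for the example.
#
#   stream = vf.get_record_stream(versions, 'unordered', True)
#   for record in filter_absent(stream):
#       text = record.get_bytes_as('fulltext')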


class VersionedFile(object):
    """Versioned text file storage.

    A versioned file manages versions of line-based text files,
    keeping track of the originating version for each line.

    To clients the "lines" of the file are represented as a list of
    strings. These strings will typically have terminal newline
    characters, but this is not required.  In particular files commonly
    do not have a newline at the end of the file.

    Texts are identified by a version-id string.
    """

    @staticmethod
    def check_not_reserved_id(version_id):
        revision.check_not_reserved_id(version_id)

    def copy_to(self, name, transport):
        """Copy this versioned file to name on transport."""
        raise NotImplementedError(self.copy_to)

    def get_record_stream(self, versions, ordering, include_delta_closure):
        """Get a stream of records for versions.

        :param versions: The versions to include. Each version is a tuple
            (version,).
        :param ordering: Either 'unordered' or 'topological'. A topologically
            sorted stream has compression parents strictly before their
            children.
        :param include_delta_closure: If True then the closure across any
            compression parents will be included (in the data content of the
            stream, not in the emitted records). This guarantees that
            'fulltext' can be used successfully on every record.
        :return: An iterator of ContentFactory objects, each of which is only
            valid until the iterator is advanced.
        """
        raise NotImplementedError(self.get_record_stream)

    def has_version(self, version_id):
        """Returns whether version is present."""
        raise NotImplementedError(self.has_version)

    def insert_record_stream(self, stream):
        """Insert a record stream into this versioned file.

        :param stream: A stream of records to insert.
        :return: None
        :seealso VersionedFile.get_record_stream:
        """
        raise NotImplementedError

    def add_lines(self, version_id, parents, lines, parent_texts=None,
        left_matching_blocks=None, nostore_sha=None, random_id=False,
        check_content=True):
        """Add a single text on top of the versioned file.

        Must raise RevisionAlreadyPresent if the new version is
        already present in file history.

        Must raise RevisionNotPresent if any of the given parents are
        not present in file history.

        :param lines: A list of lines. Each line must be a bytestring. And all
            of them except the last must be terminated with \n and contain no
            other \n's. The last line may either contain no \n's or a single
            terminating \n. If the lines list does not meet this constraint the
            add routine may error or may succeed - but you will be unable to
            read the data back accurately. (Checking the lines have been split
            correctly is expensive and extremely unlikely to catch bugs so it
            is not done at runtime unless check_content is True.)
        :param parent_texts: An optional dictionary containing the opaque
            representations of some or all of the parents of version_id to
            allow delta optimisations.  VERY IMPORTANT: the texts must be those
            returned by add_lines or data corruption can be caused.
        :param left_matching_blocks: a hint about which areas are common
            between the text and its left-hand-parent.  The format is
            the SequenceMatcher.get_matching_blocks format.
        :param nostore_sha: Raise ExistingContent and do not add the lines to
            the versioned file if the digest of the lines matches this.
        :param random_id: If True a random id has been selected rather than
            an id determined by some deterministic process such as a converter
            from a foreign VCS. When True the backend may choose not to check
            for uniqueness of the resulting key within the versioned file, so
            this should only be done when the result is expected to be unique
            anyway.
        :param check_content: If True, the lines supplied are verified to be
            bytestrings that are correctly formed lines.
        :return: The text sha1, the number of bytes in the text, and an opaque
                 representation of the inserted version which can be provided
                 back to future add_lines calls in the parent_texts dictionary.
        """
        self._check_write_ok()
        return self._add_lines(version_id, parents, lines, parent_texts,
            left_matching_blocks, nostore_sha, random_id, check_content)
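
    # Illustrative sketch (not part of the original module): adding two texts
    # to a concrete implementation and reusing the opaque parent text returned
    # by the first call.  The version ids are made up; osutils.split_lines is
    # used on the assumption that it yields correctly terminated lines.
    #
    #   lines = osutils.split_lines('hello\nworld\n')
    #   sha1, nbytes, ptext = vf.add_lines('rev-1', [], lines)
    #   vf.add_lines('rev-2', ['rev-1'],
    #       osutils.split_lines('hello\nthere\n'),
    #       parent_texts={'rev-1': ptext})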

    def _add_lines(self, version_id, parents, lines, parent_texts,
        left_matching_blocks, nostore_sha, random_id, check_content):
        """Helper to do the class specific add_lines."""
        raise NotImplementedError(self.add_lines)

    def add_lines_with_ghosts(self, version_id, parents, lines,
        parent_texts=None, nostore_sha=None, random_id=False,
        check_content=True, left_matching_blocks=None):
        """Add lines to the versioned file, allowing ghosts to be present.

        This takes the same parameters as add_lines and returns the same.
        """
        self._check_write_ok()
        return self._add_lines_with_ghosts(version_id, parents, lines,
            parent_texts, nostore_sha, random_id, check_content, left_matching_blocks)

    def _add_lines_with_ghosts(self, version_id, parents, lines, parent_texts,
        nostore_sha, random_id, check_content, left_matching_blocks):
        """Helper to do class specific add_lines_with_ghosts."""
        raise NotImplementedError(self.add_lines_with_ghosts)

    def check(self, progress_bar=None):
        """Check the versioned file for integrity."""
        raise NotImplementedError(self.check)

    def _check_lines_not_unicode(self, lines):
        """Check that lines being added to a versioned file are not unicode."""
        for line in lines:
            if line.__class__ is not str:
                raise errors.BzrBadParameterUnicode("lines")

    def _check_lines_are_lines(self, lines):
        """Check that the lines really are full lines without inline EOL."""
        for line in lines:
            if '\n' in line[:-1]:
                raise errors.BzrBadParameterContainsNewline("lines")

    def get_format_signature(self):
        """Get a text description of the data encoding in this file.

        :since: 0.90
        """
        raise NotImplementedError(self.get_format_signature)

    def make_mpdiffs(self, version_ids):
        """Create multiparent diffs for specified versions."""
        knit_versions = set()
        knit_versions.update(version_ids)
        parent_map = self.get_parent_map(version_ids)
        for version_id in version_ids:
            try:
                knit_versions.update(parent_map[version_id])
            except KeyError:
                raise errors.RevisionNotPresent(version_id, self)
        # We need to filter out ghosts, because we can't diff against them.
        knit_versions = set(self.get_parent_map(knit_versions).keys())
        lines = dict(zip(knit_versions,
            self._get_lf_split_line_list(knit_versions)))
        diffs = []
        for version_id in version_ids:
            target = lines[version_id]
            try:
                parents = [lines[p] for p in parent_map[version_id] if p in
                    knit_versions]
            except KeyError:
                # I don't know how this could ever trigger.
                # parent_map[version_id] was already triggered in the previous
                # for loop, and lines[p] has the 'if p in knit_versions' check,
                # so we again won't have a KeyError.
                raise errors.RevisionNotPresent(version_id, self)
            if len(parents) > 0:
                left_parent_blocks = self._extract_blocks(version_id,
                                                          parents[0], target)
            else:
                left_parent_blocks = None
            diffs.append(multiparent.MultiParent.from_lines(target, parents,
                         left_parent_blocks))
        return diffs

    def _extract_blocks(self, version_id, source, target):
        return None

    def add_mpdiffs(self, records):
        """Add mpdiffs to this VersionedFile.

        Records should be iterables of version, parents, expected_sha1,
        mpdiff. mpdiff should be a MultiParent instance.
        """
        # Does this need to call self._check_write_ok()? (IanC 20070919)
        vf_parents = {}
        mpvf = multiparent.MultiMemoryVersionedFile()
        versions = []
        for version, parent_ids, expected_sha1, mpdiff in records:
            versions.append(version)
            mpvf.add_diff(mpdiff, version, parent_ids)
        needed_parents = set()
        for version, parent_ids, expected_sha1, mpdiff in records:
            needed_parents.update(p for p in parent_ids
                                  if not mpvf.has_version(p))
        present_parents = set(self.get_parent_map(needed_parents).keys())
        for parent_id, lines in zip(present_parents,
                                 self._get_lf_split_line_list(present_parents)):
            mpvf.add_version(lines, parent_id, [])
        for (version, parent_ids, expected_sha1, mpdiff), lines in\
            zip(records, mpvf.get_line_list(versions)):
            if len(parent_ids) == 1:
                left_matching_blocks = list(mpdiff.get_matching_blocks(0,
                    mpvf.get_diff(parent_ids[0]).num_lines()))
            else:
                left_matching_blocks = None
            try:
                _, _, version_text = self.add_lines_with_ghosts(version,
                    parent_ids, lines, vf_parents,
                    left_matching_blocks=left_matching_blocks)
            except NotImplementedError:
                # The vf can't handle ghosts, so add lines normally, which will
                # (reasonably) fail if there are ghosts in the data.
                _, _, version_text = self.add_lines(version,
                    parent_ids, lines, vf_parents,
                    left_matching_blocks=left_matching_blocks)
            vf_parents[version] = version_text
        sha1s = self.get_sha1s(versions)
        for version, parent_ids, expected_sha1, mpdiff in records:
            if expected_sha1 != sha1s[version]:
                raise errors.VersionedFileInvalidChecksum(version)
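
    # Illustrative sketch (not part of the original module): the shape of the
    # records accepted by add_mpdiffs.  The version ids are made up, the
    # expected sha1 is the sha1 the resulting fulltext should have, and the
    # mpdiff would normally come from make_mpdiffs on another versioned file.
    #
    #   diffs = other_vf.make_mpdiffs(['rev-2'])
    #   records = [('rev-2', ['rev-1'], expected_sha1, diffs[0])]
    #   vf.add_mpdiffs(records)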

    def get_text(self, version_id):
        """Return version contents as a text string.

        Raises RevisionNotPresent if version is not present in
        file history.
        """
        return ''.join(self.get_lines(version_id))
    get_string = get_text

    def get_texts(self, version_ids):
        """Return the texts of listed versions as a list of strings.

        Raises RevisionNotPresent if version is not present in
        file history.
        """
        return [''.join(self.get_lines(v)) for v in version_ids]

    def get_lines(self, version_id):
        """Return version contents as a sequence of lines.

        Raises RevisionNotPresent if version is not present in
        file history.
        """
        raise NotImplementedError(self.get_lines)

    def _get_lf_split_line_list(self, version_ids):
        return [StringIO(t).readlines() for t in self.get_texts(version_ids)]

    def get_ancestry(self, version_ids, topo_sorted=True):
        """Return a list of all ancestors of given version(s). This
        will not include the null revision.

        This list will not be topologically sorted if topo_sorted=False is
        passed.

        Must raise RevisionNotPresent if any of the given versions are
        not present in file history."""
        if isinstance(version_ids, basestring):
            version_ids = [version_ids]
        raise NotImplementedError(self.get_ancestry)

    def get_ancestry_with_ghosts(self, version_ids):
        """Return a list of all ancestors of given version(s). This
        will not include the null revision.

        Must raise RevisionNotPresent if any of the given versions are
        not present in file history.

        Ghosts that are known about will be included in ancestry list,
        but are not explicitly marked.
        """
        raise NotImplementedError(self.get_ancestry_with_ghosts)

    def get_parent_map(self, version_ids):
        """Get a map of the parents of version_ids.

        :param version_ids: The version ids to look up parents for.
        :return: A mapping from version id to parents.
        """
        raise NotImplementedError(self.get_parent_map)

    def get_parents_with_ghosts(self, version_id):
        """Return version names for parents of version_id.

        Will raise RevisionNotPresent if version_id is not present
        in the history.

        Ghosts that are known about will be included in the parent list,
        but are not explicitly marked.
        """
        try:
            return list(self.get_parent_map([version_id])[version_id])
        except KeyError:
            raise errors.RevisionNotPresent(version_id, self)

    def annotate(self, version_id):
        """Return a list of (version-id, line) tuples for version_id.

        :raise RevisionNotPresent: If the given version is
        not present in file history.
        """
        raise NotImplementedError(self.annotate)

    def iter_lines_added_or_present_in_versions(self, version_ids=None,
                                                pb=None):
        """Iterate over the lines in the versioned file from version_ids.

        This may return lines from other versions. Each item the returned
        iterator yields is a tuple of a line and a text version that line
        is present in (not introduced in).

        Ordering of results is in whatever order is most suitable for the
        underlying storage format.

        If a progress bar is supplied, it may be used to indicate progress.
        The caller is responsible for cleaning up progress bars (because this
        is an iterator).

        NOTES: Lines are normalised: they will all have \n terminators.
               Lines are returned in arbitrary order.

        :return: An iterator over (line, version_id).
        """
        raise NotImplementedError(self.iter_lines_added_or_present_in_versions)
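
    # Illustrative sketch (not part of the original module): collecting every
    # line that occurs in a set of versions, e.g. as a cheap way to test line
    # presence across several versions at once.  'vf' and the version ids are
    # assumptions made for the example.
    #
    #   seen = set()
    #   for line, version_id in vf.iter_lines_added_or_present_in_versions(
    #           ['rev-1', 'rev-2']):
    #       seen.add(line)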

    def plan_merge(self, ver_a, ver_b):
        """Return pseudo-annotation indicating how the two versions merge.

        This is computed between versions a and b and their common
        base.

        Weave lines present in none of them are skipped entirely.

        Legend:
        killed-base Dead in base revision
        killed-both Killed in each revision
        killed-a    Killed in a
        killed-b    Killed in b
        unchanged   Alive in both a and b (possibly created in both)
        new-a       Created in a
        new-b       Created in b
        ghost-a     Killed in a, unborn in b
        ghost-b     Killed in b, unborn in a
        irrelevant  Not in either revision
        """
        raise NotImplementedError(VersionedFile.plan_merge)

    def weave_merge(self, plan, a_marker=TextMerge.A_MARKER,
                    b_marker=TextMerge.B_MARKER):
        return PlanWeaveMerge(plan, a_marker, b_marker).merge_lines()[0]


class RecordingVersionedFilesDecorator(object):
    """A minimal versioned files implementation that records calls made on it.

    Only enough methods have been added to support tests using it to date.

    :ivar calls: A list of the calls made; can be reset at any time by
        assigning [] to it.
    """

    def __init__(self, backing_vf):
        """Create a RecordingVersionedFilesDecorator decorating backing_vf.

        :param backing_vf: The versioned file to answer all methods.
        """
        self._backing_vf = backing_vf
        self.calls = []

    def add_lines(self, key, parents, lines, parent_texts=None,
        left_matching_blocks=None, nostore_sha=None, random_id=False,
        check_content=True):
        self.calls.append(("add_lines", key, parents, lines, parent_texts,
            left_matching_blocks, nostore_sha, random_id, check_content))
        return self._backing_vf.add_lines(key, parents, lines, parent_texts,
            left_matching_blocks, nostore_sha, random_id, check_content)

    def check(self):
        self._backing_vf.check()

    def get_parent_map(self, keys):
        self.calls.append(("get_parent_map", copy(keys)))
        return self._backing_vf.get_parent_map(keys)

    def get_record_stream(self, keys, sort_order, include_delta_closure):
        self.calls.append(("get_record_stream", list(keys), sort_order,
            include_delta_closure))
        return self._backing_vf.get_record_stream(keys, sort_order,
            include_delta_closure)

    def get_sha1s(self, keys):
        self.calls.append(("get_sha1s", copy(keys)))
        return self._backing_vf.get_sha1s(keys)

    def iter_lines_added_or_present_in_keys(self, keys, pb=None):
        self.calls.append(("iter_lines_added_or_present_in_keys", copy(keys)))
        return self._backing_vf.iter_lines_added_or_present_in_keys(keys, pb=pb)

    def keys(self):
        self.calls.append(("keys",))
        return self._backing_vf.keys()
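
# Illustrative sketch (not part of the original module): a test can wrap a
# real versioned files object and then assert on the recorded calls.  The
# backing object and key value are made up.
#
#   vf = RecordingVersionedFilesDecorator(backing_vf)
#   vf.get_parent_map([('rev-1',)])
#   vf.calls   # -> [("get_parent_map", [('rev-1',)])]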


class OrderingVersionedFilesDecorator(RecordingVersionedFilesDecorator):
    """A VF that records calls, and returns keys in specific order.

    :ivar calls: A list of the calls made; can be reset at any time by
        assigning [] to it.
    """

    def __init__(self, backing_vf, key_priority):
        """Create an OrderingVersionedFilesDecorator decorating backing_vf.

        :param backing_vf: The versioned file to answer all methods.
        :param key_priority: A dictionary defining what order keys should be
            returned from an 'unordered' get_record_stream request.
            Keys with lower priority are returned first, keys not present in
            the map get an implicit priority of 0, and are returned in
            lexicographical order.
        """
        RecordingVersionedFilesDecorator.__init__(self, backing_vf)
        self._key_priority = key_priority

    def get_record_stream(self, keys, sort_order, include_delta_closure):
        self.calls.append(("get_record_stream", list(keys), sort_order,
            include_delta_closure))
        if sort_order == 'unordered':
            def sort_key(key):
                return (self._key_priority.get(key, 0), key)
            # Use a defined order by asking for the keys one-by-one from the
            # backing_vf
            for key in sorted(keys, key=sort_key):
                for record in self._backing_vf.get_record_stream([key],
                                'unordered', include_delta_closure):
                    yield record
        else:
            for record in self._backing_vf.get_record_stream(keys, sort_order,
                            include_delta_closure):
                yield record
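
# Illustrative sketch (not part of the original module): forcing a
# deterministic order for an 'unordered' stream in a test.  The keys and
# priorities are made up; unlisted keys get priority 0 and therefore come
# first, in sorted order.
#
#   vf = OrderingVersionedFilesDecorator(backing_vf,
#       {('rev-b',): 2, ('rev-a',): 1})
#   stream = vf.get_record_stream(
#       [('rev-b',), ('rev-a',), ('rev-c',)], 'unordered', False)
#   # records are yielded for ('rev-c',), then ('rev-a',), then ('rev-b',)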


class KeyMapper(object):
    """KeyMappers map between keys and underlying partitioned storage."""

    def map(self, key):
        """Map key to an underlying storage identifier.

        :param key: A key tuple e.g. ('file-id', 'revision-id').
        :return: An underlying storage identifier, specific to the partitioning
            mechanism.
        """
        raise NotImplementedError(self.map)

    def unmap(self, partition_id):
        """Map a partitioned storage id back to a key prefix.

        :param partition_id: The underlying partition id.
        :return: As much of a key (or prefix) as is derivable from the partition
            id.
        """
        raise NotImplementedError(self.unmap)


class ConstantMapper(KeyMapper):
    """A key mapper that maps to a constant result."""

    def __init__(self, result):
        """Create a ConstantMapper which will return result for all maps."""
        self._result = result

    def map(self, key):
        """See KeyMapper.map()."""
        return self._result


class URLEscapeMapper(KeyMapper):
    """Base class for use with transport backed storage.

    This provides a map and unmap wrapper that respectively url escape and
    unescape their outputs and inputs.
    """

    def map(self, key):
        """See KeyMapper.map()."""
        return urllib.quote(self._map(key))

    def unmap(self, partition_id):
        """See KeyMapper.unmap()."""
        return self._unmap(urllib.unquote(partition_id))


class PrefixMapper(URLEscapeMapper):
    """A key mapper that extracts the first component of a key.

    This mapper is for use with a transport based backend.
    """

    def _map(self, key):
        """See KeyMapper.map()."""
        return key[0]

    def _unmap(self, partition_id):
        """See KeyMapper.unmap()."""
        return (partition_id,)


class HashPrefixMapper(URLEscapeMapper):
    """A key mapper that combines the first component of a key with a hash.

    This mapper is for use with a transport based backend.
    """

    def _map(self, key):
        """See KeyMapper.map()."""
        prefix = self._escape(key[0])
        return "%02x/%s" % (adler32(prefix) & 0xff, prefix)

    def _escape(self, prefix):
        """No escaping needed here."""
        return prefix

    def _unmap(self, partition_id):
        """See KeyMapper.unmap()."""
        return (self._unescape(osutils.basename(partition_id)),)

    def _unescape(self, basename):
        """No unescaping needed for HashPrefixMapper."""
        return basename
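
# Illustrative sketch (not part of the original module): how the transport
# backed mappers partition a two-element key.  The key is made up, and the
# hash prefix shown for HashPrefixMapper is schematic (two hex digits of an
# adler32 checksum), not a computed value.
#
#   PrefixMapper().map(('file-id', 'rev-1'))      # -> 'file-id'
#   PrefixMapper().unmap('file-id')               # -> ('file-id',)
#   HashPrefixMapper().map(('file-id', 'rev-1'))  # -> '<xx>/file-id'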


class HashEscapedPrefixMapper(HashPrefixMapper):
    """Combines the escaped first component of a key with a hash.

    This mapper is for use with a transport based backend.
    """

    _safe = "abcdefghijklmnopqrstuvwxyz0123456789-_@,."

    def _escape(self, prefix):
        """Turn a key element into a filesystem safe string.

        This is similar to a plain urllib.quote, except
        it uses specific safe characters, so that it doesn't
        have to translate a lot of valid file ids.
        """
        # @ does not get escaped. This is because it is a valid
        # filesystem character we use all the time, and it looks
        # a lot better than seeing %40 all the time.
        r = [((c in self._safe) and c or ('%%%02x' % ord(c)))
             for c in prefix]
        return ''.join(r)

    def _unescape(self, basename):
        """Escaped names are easily unescaped by urlutils."""
        return urllib.unquote(basename)


def make_versioned_files_factory(versioned_file_factory, mapper):
    """Create a ThunkedVersionedFiles factory.

    This will create a callable which when called creates a
    ThunkedVersionedFiles on a transport, using mapper to access individual
    versioned files, and versioned_file_factory to create each individual file.
    """
    def factory(transport):
        return ThunkedVersionedFiles(transport, versioned_file_factory, mapper,
            lambda:True)
    return factory


class VersionedFiles(object):
    """Storage for many versioned files.

    This object allows a single keyspace for accessing the history graph and
    contents of named bytestrings.

    Currently no implementation allows the graph of different key prefixes to
    intersect, but the API does allow such implementations in the future.

    The keyspace is expressed via simple tuples. Any instance of VersionedFiles
    may have a different length key-size, but that size will be constant for
    all texts added to or retrieved from it. For instance, bzrlib uses
    instances with a key-size of 2 for storing user files in a repository, with
    the first element the fileid, and the second the version of that file.

    The use of tuples allows a single code base to support several different
    uses with only the mapping logic changing from instance to instance.
    """

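    # Illustrative sketch (not part of the original module): with a key-size
    # of 2 the keyspace looks like (file_id, revision_id) tuples, and parent
    # lookups use the same tuples.  The ids and the 'vfs' instance are made up.
    #
    #   key = ('file-id-1234', 'rev-5')
    #   parent_map = vfs.get_parent_map([key])
    #   # e.g. {('file-id-1234', 'rev-5'): (('file-id-1234', 'rev-4'),)}
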
790
    def add_lines(self, key, parents, lines, parent_texts=None,
791
        left_matching_blocks=None, nostore_sha=None, random_id=False,
792
        check_content=True):
793
        """Add a text to the store.
794
795
        :param key: The key tuple of the text to add.
796
        :param parents: The parents key tuples of the text to add.
797
        :param lines: A list of lines. Each line must be a bytestring. And all
798
            of them except the last must be terminated with \n and contain no
799
            other \n's. The last line may either contain no \n's or a single
800
            terminating \n. If the lines list does meet this constraint the add
801
            routine may error or may succeed - but you will be unable to read
802
            the data back accurately. (Checking the lines have been split
803
            correctly is expensive and extremely unlikely to catch bugs so it
804
            is not done at runtime unless check_content is True.)
805
        :param parent_texts: An optional dictionary containing the opaque 
806
            representations of some or all of the parents of version_id to
807
            allow delta optimisations.  VERY IMPORTANT: the texts must be those
808
            returned by add_lines or data corruption can be caused.
809
        :param left_matching_blocks: a hint about which areas are common
810
            between the text and its left-hand-parent.  The format is
811
            the SequenceMatcher.get_matching_blocks format.
812
        :param nostore_sha: Raise ExistingContent and do not add the lines to
813
            the versioned file if the digest of the lines matches this.
814
        :param random_id: If True a random id has been selected rather than
815
            an id determined by some deterministic process such as a converter
816
            from a foreign VCS. When True the backend may choose not to check
817
            for uniqueness of the resulting key within the versioned file, so
818
            this should only be done when the result is expected to be unique
819
            anyway.
820
        :param check_content: If True, the lines supplied are verified to be
821
            bytestrings that are correctly formed lines.
822
        :return: The text sha1, the number of bytes in the text, and an opaque
823
                 representation of the inserted version which can be provided
824
                 back to future add_lines calls in the parent_texts dictionary.
825
        """
826
        raise NotImplementedError(self.add_lines)

    def add_mpdiffs(self, records):
        """Add mpdiffs to this VersionedFile.

        Records should be iterables of version, parents, expected_sha1,
        mpdiff. mpdiff should be a MultiParent instance.
        """
        vf_parents = {}
        mpvf = multiparent.MultiMemoryVersionedFile()
        versions = []
        for version, parent_ids, expected_sha1, mpdiff in records:
            versions.append(version)
            mpvf.add_diff(mpdiff, version, parent_ids)
        needed_parents = set()
        for version, parent_ids, expected_sha1, mpdiff in records:
            needed_parents.update(p for p in parent_ids
                                  if not mpvf.has_version(p))
        # It seems likely that adding all the present parents as fulltexts can
        # easily exhaust memory.
        chunks_to_lines = osutils.chunks_to_lines
        for record in self.get_record_stream(needed_parents, 'unordered',
            True):
            if record.storage_kind == 'absent':
                continue
            mpvf.add_version(chunks_to_lines(record.get_bytes_as('chunked')),
                record.key, [])
        for (key, parent_keys, expected_sha1, mpdiff), lines in \
            zip(records, mpvf.get_line_list(versions)):
            if len(parent_keys) == 1:
                left_matching_blocks = list(mpdiff.get_matching_blocks(0,
                    mpvf.get_diff(parent_keys[0]).num_lines()))
            else:
                left_matching_blocks = None
            version_sha1, _, version_text = self.add_lines(key,
                parent_keys, lines, vf_parents,
                left_matching_blocks=left_matching_blocks)
            if version_sha1 != expected_sha1:
                raise errors.VersionedFileInvalidChecksum(version)
            vf_parents[key] = version_text
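
    # Record format sketch for add_mpdiffs above (hypothetical names, for
    # illustration only): each record pairs a key with its parent keys, the
    # sha1 expected for the reconstructed fulltext, and a MultiParent diff:
    #
    #   records = [(('file-id', 'rev-2'), [('file-id', 'rev-1')],
    #               expected_sha1, mpdiff)]
    #   texts.add_mpdiffs(records)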

    def annotate(self, key):
        """Return a list of (version-key, line) tuples for the text of key.

        :raise RevisionNotPresent: If the key is not present.
        """
        raise NotImplementedError(self.annotate)

    def check(self, progress_bar=None):
        """Check this object for integrity."""
        raise NotImplementedError(self.check)

    @staticmethod
    def check_not_reserved_id(version_id):
        revision.check_not_reserved_id(version_id)

    def _check_lines_not_unicode(self, lines):
        """Check that lines being added to a versioned file are not unicode."""
        for line in lines:
            if line.__class__ is not str:
                raise errors.BzrBadParameterUnicode("lines")

    def _check_lines_are_lines(self, lines):
        """Check that the lines really are full lines without inline EOL."""
        for line in lines:
            if '\n' in line[:-1]:
                raise errors.BzrBadParameterContainsNewline("lines")

    def get_parent_map(self, keys):
        """Get a map of the parents of keys.

        :param keys: The keys to look up parents for.
        :return: A mapping from keys to parents. Absent keys are absent from
            the mapping.
        """
        raise NotImplementedError(self.get_parent_map)
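
    # Shape sketch (illustrative, hypothetical keys): querying two keys where
    # only one is present might return
    #
    #   {('file-id', 'rev-1'): (('file-id', 'rev-0'),)}
    #
    # i.e. the absent key simply does not appear in the result.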

    def get_record_stream(self, keys, ordering, include_delta_closure):
        """Get a stream of records for keys.

        :param keys: The keys to include.
        :param ordering: Either 'unordered' or 'topological'. A topologically
            sorted stream has compression parents strictly before their
            children.
        :param include_delta_closure: If True then the closure across any
            compression parents will be included (in the opaque data).
        :return: An iterator of ContentFactory objects, each of which is only
            valid until the iterator is advanced.
        """
        raise NotImplementedError(self.get_record_stream)
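
    # Consumption sketch (illustrative; 'texts' stands for any concrete
    # implementation). Each record must be used before the iterator advances:
    #
    #   for record in texts.get_record_stream(keys, 'topological', True):
    #       if record.storage_kind == 'absent':
    #           continue
    #       chunks = record.get_bytes_as('chunked')
    #       lines = osutils.chunks_to_lines(chunks)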

    def get_sha1s(self, keys):
        """Get the sha1's of the texts for the given keys.

        :param keys: The names of the keys to look up.
        :return: a dict from key to sha1 digest. Keys of texts which are not
            present in the store are not present in the returned
            dictionary.
        """
        raise NotImplementedError(self.get_sha1s)

    has_key = index._has_key_from_parent_map

    def get_missing_compression_parent_keys(self):
        """Return an iterable of keys of missing compression parents.

        Check this after calling insert_record_stream to find out if there are
        any missing compression parents.  If there are, the records that
        depend on them cannot be inserted safely. The precise behaviour
        depends on the concrete VersionedFiles class in use.

        Classes that do not support this will raise NotImplementedError.
        """
        raise NotImplementedError(self.get_missing_compression_parent_keys)

    def insert_record_stream(self, stream):
        """Insert a record stream into this container.

        :param stream: A stream of records to insert.
        :return: None
        :seealso: VersionedFile.get_record_stream
        """
        raise NotImplementedError(self.insert_record_stream)

    def iter_lines_added_or_present_in_keys(self, keys, pb=None):
        """Iterate over the lines in the versioned files from keys.

        This may return lines from other keys. Each item the returned
        iterator yields is a tuple of a line and a text version that that line
        is present in (not introduced in).

        Ordering of results is in whatever order is most suitable for the
        underlying storage format.

        If a progress bar is supplied, it may be used to indicate progress.
        The caller is responsible for cleaning up progress bars (because this
        is an iterator).

        NOTES:
         * Lines are normalised by the underlying store: they will all have \n
           terminators.
         * Lines are returned in arbitrary order.

        :return: An iterator over (line, key).
        """
        raise NotImplementedError(self.iter_lines_added_or_present_in_keys)
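
    # Usage sketch (illustrative): gathering every normalised line reachable
    # from a set of keys, for example when building a content index:
    #
    #   seen_lines = set()
    #   for line, key in texts.iter_lines_added_or_present_in_keys(keys):
    #       seen_lines.add(line)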

    def keys(self):
        """Return an iterable of the keys for all the contained texts."""
        raise NotImplementedError(self.keys)

    def make_mpdiffs(self, keys):
        """Create multiparent diffs for specified keys."""
        keys_order = tuple(keys)
        keys = frozenset(keys)
        knit_keys = set(keys)
        parent_map = self.get_parent_map(keys)
        for parent_keys in parent_map.itervalues():
            if parent_keys:
                knit_keys.update(parent_keys)
        missing_keys = keys - set(parent_map)
        if missing_keys:
            raise errors.RevisionNotPresent(list(missing_keys)[0], self)
        # We need to filter out ghosts, because we can't diff against them.
        maybe_ghosts = knit_keys - keys
        ghosts = maybe_ghosts - set(self.get_parent_map(maybe_ghosts))
        knit_keys.difference_update(ghosts)
        lines = {}
        chunks_to_lines = osutils.chunks_to_lines
        for record in self.get_record_stream(knit_keys, 'topological', True):
            lines[record.key] = chunks_to_lines(record.get_bytes_as('chunked'))
            # line_block_dict = {}
            # for parent, blocks in record.extract_line_blocks():
            #   line_blocks[parent] = blocks
            # line_blocks[record.key] = line_block_dict
        diffs = []
        for key in keys_order:
            target = lines[key]
            parents = parent_map[key] or []
            # Note that filtering knit_keys can lead to a parent difference
            # between the creation and the application of the mpdiff.
            parent_lines = [lines[p] for p in parents if p in knit_keys]
            if len(parent_lines) > 0:
                left_parent_blocks = self._extract_blocks(key, parent_lines[0],
                    target)
            else:
                left_parent_blocks = None
            diffs.append(multiparent.MultiParent.from_lines(target,
                parent_lines, left_parent_blocks))
        return diffs

    missing_keys = index._missing_keys_from_parent_map

    def _extract_blocks(self, version_id, source, target):
        return None


class ThunkedVersionedFiles(VersionedFiles):
    """Storage for many versioned files thunked onto a 'VersionedFile' class.

    This object allows a single keyspace for accessing the history graph and
    contents of named bytestrings.

    Currently no implementation allows the graph of different key prefixes to
    intersect, but the API does allow such implementations in the future.
    """

    def __init__(self, transport, file_factory, mapper, is_locked):
        """Create a ThunkedVersionedFiles."""
        self._transport = transport
        self._file_factory = file_factory
        self._mapper = mapper
        self._is_locked = is_locked

    def add_lines(self, key, parents, lines, parent_texts=None,
        left_matching_blocks=None, nostore_sha=None, random_id=False,
        check_content=True):
        """See VersionedFiles.add_lines()."""
        path = self._mapper.map(key)
        version_id = key[-1]
        parents = [parent[-1] for parent in parents]
        vf = self._get_vf(path)
        try:
            try:
                return vf.add_lines_with_ghosts(version_id, parents, lines,
                    parent_texts=parent_texts,
                    left_matching_blocks=left_matching_blocks,
                    nostore_sha=nostore_sha, random_id=random_id,
                    check_content=check_content)
            except NotImplementedError:
                return vf.add_lines(version_id, parents, lines,
                    parent_texts=parent_texts,
                    left_matching_blocks=left_matching_blocks,
                    nostore_sha=nostore_sha, random_id=random_id,
                    check_content=check_content)
        except errors.NoSuchFile:
            # parent directory may be missing, try again.
            self._transport.mkdir(osutils.dirname(path))
            try:
                return vf.add_lines_with_ghosts(version_id, parents, lines,
                    parent_texts=parent_texts,
                    left_matching_blocks=left_matching_blocks,
                    nostore_sha=nostore_sha, random_id=random_id,
                    check_content=check_content)
            except NotImplementedError:
                return vf.add_lines(version_id, parents, lines,
                    parent_texts=parent_texts,
                    left_matching_blocks=left_matching_blocks,
                    nostore_sha=nostore_sha, random_id=random_id,
                    check_content=check_content)

    def annotate(self, key):
        """Return a list of (version-key, line) tuples for the text of key.

        :raise RevisionNotPresent: If the key is not present.
        """
        prefix = key[:-1]
        path = self._mapper.map(prefix)
        vf = self._get_vf(path)
        origins = vf.annotate(key[-1])
        result = []
        for origin, line in origins:
            result.append((prefix + (origin,), line))
        return result

    def check(self, progress_bar=None):
        """See VersionedFiles.check()."""
        for prefix, vf in self._iter_all_components():
            vf.check()

    def get_parent_map(self, keys):
        """Get a map of the parents of keys.

        :param keys: The keys to look up parents for.
        :return: A mapping from keys to parents. Absent keys are absent from
            the mapping.
        """
        prefixes = self._partition_keys(keys)
        result = {}
        for prefix, suffixes in prefixes.items():
            path = self._mapper.map(prefix)
            vf = self._get_vf(path)
            parent_map = vf.get_parent_map(suffixes)
            for key, parents in parent_map.items():
                result[prefix + (key,)] = tuple(
                    prefix + (parent,) for parent in parents)
        return result

    def _get_vf(self, path):
        if not self._is_locked():
            raise errors.ObjectNotLocked(self)
        return self._file_factory(path, self._transport, create=True,
            get_scope=lambda: None)

    def _partition_keys(self, keys):
        """Turn keys into a dict of prefix:suffix_list."""
        result = {}
        for key in keys:
            prefix_keys = result.setdefault(key[:-1], [])
            prefix_keys.append(key[-1])
        return result
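
    # Partitioning sketch (illustrative, hypothetical keys):
    #
    #   _partition_keys([('f1', 'rev-1'), ('f1', 'rev-2'), ('f2', 'rev-1')])
    #   => {('f1',): ['rev-1', 'rev-2'], ('f2',): ['rev-1']}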

    def _get_all_prefixes(self):
        # Identify all key prefixes.
        # XXX: A bit hacky, needs polish.
        if type(self._mapper) == ConstantMapper:
            paths = [self._mapper.map(())]
            prefixes = [()]
        else:
            relpaths = set()
            for quoted_relpath in self._transport.iter_files_recursive():
                path, ext = os.path.splitext(quoted_relpath)
                relpaths.add(path)
            paths = list(relpaths)
            prefixes = [self._mapper.unmap(path) for path in paths]
        return zip(paths, prefixes)
1142
1143
    def get_record_stream(self, keys, ordering, include_delta_closure):
1144
        """See VersionedFiles.get_record_stream()."""
1145
        # Ordering will be taken care of by each partitioned store; group keys
1146
        # by partition.
1147
        keys = sorted(keys)
1148
        for prefix, suffixes, vf in self._iter_keys_vf(keys):
1149
            suffixes = [(suffix,) for suffix in suffixes]
1150
            for record in vf.get_record_stream(suffixes, ordering,
1151
                include_delta_closure):
1152
                if record.parents is not None:
1153
                    record.parents = tuple(
1154
                        prefix + parent for parent in record.parents)
1155
                record.key = prefix + record.key
1156
                yield record
1157
1158
    def _iter_keys_vf(self, keys):
1159
        prefixes = self._partition_keys(keys)
1160
        sha1s = {}
1161
        for prefix, suffixes in prefixes.items():
1162
            path = self._mapper.map(prefix)
1163
            vf = self._get_vf(path)
1164
            yield prefix, suffixes, vf

    def get_sha1s(self, keys):
        """See VersionedFiles.get_sha1s()."""
        sha1s = {}
        for prefix, suffixes, vf in self._iter_keys_vf(keys):
            vf_sha1s = vf.get_sha1s(suffixes)
            for suffix, sha1 in vf_sha1s.iteritems():
                sha1s[prefix + (suffix,)] = sha1
        return sha1s

    def insert_record_stream(self, stream):
        """Insert a record stream into this container.

        :param stream: A stream of records to insert.
        :return: None
        :seealso: VersionedFile.get_record_stream
        """
        for record in stream:
            prefix = record.key[:-1]
            key = record.key[-1:]
            if record.parents is not None:
                parents = [parent[-1:] for parent in record.parents]
            else:
                parents = None
            thunk_record = AdapterFactory(key, parents, record)
            path = self._mapper.map(prefix)
            # Note that this parses the file many times; we can do better but
            # as this only impacts weaves in terms of performance, it is
            # tolerable.
            vf = self._get_vf(path)
            vf.insert_record_stream([thunk_record])

    def iter_lines_added_or_present_in_keys(self, keys, pb=None):
        """Iterate over the lines in the versioned files from keys.

        This may return lines from other keys. Each item the returned
        iterator yields is a tuple of a line and a text version that that line
        is present in (not introduced in).

        Ordering of results is in whatever order is most suitable for the
        underlying storage format.

        If a progress bar is supplied, it may be used to indicate progress.
        The caller is responsible for cleaning up progress bars (because this
        is an iterator).

        NOTES:
         * Lines are normalised by the underlying store: they will all have \n
           terminators.
         * Lines are returned in arbitrary order.

        :return: An iterator over (line, key).
        """
        for prefix, suffixes, vf in self._iter_keys_vf(keys):
            for line, version in vf.iter_lines_added_or_present_in_versions(suffixes):
                yield line, prefix + (version,)

    def _iter_all_components(self):
        for path, prefix in self._get_all_prefixes():
            yield prefix, self._get_vf(path)

    def keys(self):
        """See VersionedFiles.keys()."""
        result = set()
        for prefix, vf in self._iter_all_components():
            for suffix in vf.versions():
                result.add(prefix + (suffix,))
        return result


class _PlanMergeVersionedFile(VersionedFiles):
    """A VersionedFile for uncommitted and committed texts.

    It is intended to allow merges to be planned with working tree texts.
    It implements only the small part of the VersionedFiles interface used by
    PlanMerge.  It falls back to multiple versionedfiles for data not stored in
    _PlanMergeVersionedFile itself.

    :ivar fallback_versionedfiles: a list of VersionedFiles objects that can be
        queried for missing texts.
    """

    def __init__(self, file_id):
        """Create a _PlanMergeVersionedFile.

        :param file_id: Used with _PlanMerge code which is not yet fully
            tuple-keyspace aware.
        """
        self._file_id = file_id
        # fallback locations
        self.fallback_versionedfiles = []
        # Parents for locally held keys.
        self._parents = {}
        # line data for locally held keys.
        self._lines = {}
        # key lookup providers
        self._providers = [DictParentsProvider(self._parents)]

    def plan_merge(self, ver_a, ver_b, base=None):
        """See VersionedFile.plan_merge"""
        from bzrlib.merge import _PlanMerge
        if base is None:
            return _PlanMerge(ver_a, ver_b, self, (self._file_id,)).plan_merge()
        old_plan = list(_PlanMerge(ver_a, base, self, (self._file_id,)).plan_merge())
        new_plan = list(_PlanMerge(ver_a, ver_b, self, (self._file_id,)).plan_merge())
        return _PlanMerge._subtract_plans(old_plan, new_plan)

    def plan_lca_merge(self, ver_a, ver_b, base=None):
        from bzrlib.merge import _PlanLCAMerge
        graph = Graph(self)
        new_plan = _PlanLCAMerge(ver_a, ver_b, self, (self._file_id,), graph).plan_merge()
        if base is None:
            return new_plan
        old_plan = _PlanLCAMerge(ver_a, base, self, (self._file_id,), graph).plan_merge()
        return _PlanLCAMerge._subtract_plans(list(old_plan), list(new_plan))

    def add_lines(self, key, parents, lines):
        """See VersionedFiles.add_lines

        Lines are added locally, not to fallback versionedfiles.  Also, ghosts
        are permitted.  Only reserved ids are permitted.
        """
        if type(key) is not tuple:
            raise TypeError(key)
        if not revision.is_reserved_id(key[-1]):
            raise ValueError('Only reserved ids may be used')
        if parents is None:
            raise ValueError('Parents may not be None')
        if lines is None:
            raise ValueError('Lines may not be None')
        self._parents[key] = tuple(parents)
        self._lines[key] = lines
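
    # Usage sketch (illustrative; the key names are hypothetical). Reserved
    # revision ids are the ones ending in ':', such as the ids used for
    # uncommitted working tree texts during merge planning:
    #
    #   vf.add_lines(('file-id', 'this:'), [('file-id', 'rev-1')],
    #                ['working tree line\n'])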

    def get_record_stream(self, keys, ordering, include_delta_closure):
        pending = set(keys)
        for key in keys:
            if key in self._lines:
                lines = self._lines[key]
                parents = self._parents[key]
                pending.remove(key)
                yield ChunkedContentFactory(key, parents, None, lines)
        for versionedfile in self.fallback_versionedfiles:
            for record in versionedfile.get_record_stream(
                pending, 'unordered', True):
                if record.storage_kind == 'absent':
                    continue
                else:
                    pending.remove(record.key)
                    yield record
            if not pending:
                return
        # report absent entries
        for key in pending:
            yield AbsentContentFactory(key)

    def get_parent_map(self, keys):
        """See VersionedFiles.get_parent_map"""
        # We create a new provider because a fallback may have been added.
        # If we make fallbacks private we can update a stack list and avoid
        # object creation thrashing.
        keys = set(keys)
        result = {}
        if revision.NULL_REVISION in keys:
            keys.remove(revision.NULL_REVISION)
            result[revision.NULL_REVISION] = ()
        self._providers = self._providers[:1] + self.fallback_versionedfiles
        result.update(
            _StackedParentsProvider(self._providers).get_parent_map(keys))
        for key, parents in result.iteritems():
            if parents == ():
                result[key] = (revision.NULL_REVISION,)
        return result


class PlanWeaveMerge(TextMerge):
    """Weave merge that takes a plan as its input.

    This exists so that VersionedFile.plan_merge is implementable.
    Most callers will want to use WeaveMerge instead.
    """

    def __init__(self, plan, a_marker=TextMerge.A_MARKER,
                 b_marker=TextMerge.B_MARKER):
        TextMerge.__init__(self, a_marker, b_marker)
        self.plan = plan

    def _merge_struct(self):
        lines_a = []
        lines_b = []
        ch_a = ch_b = False

        def outstanding_struct():
            if not lines_a and not lines_b:
                return
            elif ch_a and not ch_b:
                # one-sided change:
                yield (lines_a,)
            elif ch_b and not ch_a:
                yield (lines_b,)
            elif lines_a == lines_b:
                yield (lines_a,)
            else:
                yield (lines_a, lines_b)

        # We previously considered either 'unchanged' or 'killed-both' lines
        # to be possible places to resynchronize.  However, assuming agreement
        # on killed-both lines may be too aggressive. -- mbp 20060324
        for state, line in self.plan:
            if state == 'unchanged':
                # resync and flush any queued conflicting changes
                for struct in outstanding_struct():
                    yield struct
                lines_a = []
                lines_b = []
                ch_a = ch_b = False

            if state == 'unchanged':
                if line:
                    yield ([line],)
            elif state == 'killed-a':
                ch_a = True
                lines_b.append(line)
            elif state == 'killed-b':
                ch_b = True
                lines_a.append(line)
            elif state == 'new-a':
                ch_a = True
                lines_a.append(line)
            elif state == 'new-b':
                ch_b = True
                lines_b.append(line)
            elif state == 'conflicted-a':
                ch_b = ch_a = True
                lines_a.append(line)
            elif state == 'conflicted-b':
                ch_b = ch_a = True
                lines_b.append(line)
            else:
                if state not in ('irrelevant', 'ghost-a', 'ghost-b',
                        'killed-base', 'killed-both'):
                    raise AssertionError(state)
        for struct in outstanding_struct():
            yield struct


class WeaveMerge(PlanWeaveMerge):
    """Weave merge that takes a VersionedFile and two versions as its input."""

    def __init__(self, versionedfile, ver_a, ver_b,
        a_marker=PlanWeaveMerge.A_MARKER, b_marker=PlanWeaveMerge.B_MARKER):
        plan = versionedfile.plan_merge(ver_a, ver_b)
        PlanWeaveMerge.__init__(self, plan, a_marker, b_marker)


class VirtualVersionedFiles(VersionedFiles):
    """Dummy implementation for VersionedFiles that uses other functions for
    obtaining fulltexts and parent maps.

    This is always on the bottom of the stack and uses string keys
    (rather than tuples) internally.
    """

    def __init__(self, get_parent_map, get_lines):
        """Create a VirtualVersionedFiles.

        :param get_parent_map: Same signature as Repository.get_parent_map.
        :param get_lines: Should return lines for specified key or None if
                          not available.
        """
        super(VirtualVersionedFiles, self).__init__()
        self._get_parent_map = get_parent_map
        self._get_lines = get_lines

    def check(self, progressbar=None):
        """See VersionedFiles.check.

        :note: Always returns True for VirtualVersionedFiles.
        """
        return True

    def add_mpdiffs(self, records):
        """See VersionedFiles.add_mpdiffs.

        :note: Not implemented for VirtualVersionedFiles.
        """
        raise NotImplementedError(self.add_mpdiffs)

    def get_parent_map(self, keys):
        """See VersionedFiles.get_parent_map."""
        return dict([((k,), tuple([(p,) for p in v]))
            for k, v in self._get_parent_map([k for (k,) in keys]).iteritems()])
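
    # Translation sketch (illustrative, hypothetical ids): the backing
    # callable works on plain string keys, so ('rev-1',) tuple keys are
    # unwrapped on the way in and the resulting parents are rewrapped, e.g.
    #
    #   {'rev-1': ('rev-0',)}  becomes  {('rev-1',): (('rev-0',),)}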

    def get_sha1s(self, keys):
        """See VersionedFiles.get_sha1s."""
        ret = {}
        for (k,) in keys:
            lines = self._get_lines(k)
            if lines is not None:
                if not isinstance(lines, list):
                    raise AssertionError
                ret[(k,)] = osutils.sha_strings(lines)
        return ret

    def get_record_stream(self, keys, ordering, include_delta_closure):
        """See VersionedFiles.get_record_stream."""
        for (k,) in list(keys):
            lines = self._get_lines(k)
            if lines is not None:
                if not isinstance(lines, list):
                    raise AssertionError
                yield ChunkedContentFactory((k,), None,
                        sha1=osutils.sha_strings(lines),
                        chunks=lines)
            else:
                yield AbsentContentFactory((k,))

    def iter_lines_added_or_present_in_keys(self, keys, pb=None):
        """See VersionedFiles.iter_lines_added_or_present_in_keys()."""
        for i, (key,) in enumerate(keys):
            if pb is not None:
                pb.update("iterating texts", i, len(keys))
            for l in self._get_lines(key):
                yield (l, key)


def network_bytes_to_kind_and_offset(network_bytes):
    """Strip off a record kind from the front of network_bytes.

    :param network_bytes: The bytes of a record.
    :return: A tuple (storage_kind, offset_of_remaining_bytes)
    """
    line_end = network_bytes.find('\n')
    storage_kind = network_bytes[:line_end]
    return storage_kind, line_end + 1
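
# Parsing sketch (illustrative bytes): the storage kind is everything up to
# the first newline, and the returned offset points just past it, e.g.
#
#   network_bytes_to_kind_and_offset('knit-ft-gz\nrest of the record')
#   => ('knit-ft-gz', 11)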


class NetworkRecordStream(object):
    """A record_stream which reconstitutes a serialised stream."""

    def __init__(self, bytes_iterator):
        """Create a NetworkRecordStream.

        :param bytes_iterator: An iterator of bytes. Each item in this
            iterator should have been obtained from a record_stream's
            record.get_bytes_as(record.storage_kind) call.
        """
        self._bytes_iterator = bytes_iterator
        self._kind_factory = {'knit-ft-gz': knit.knit_network_to_record,
            'knit-delta-gz': knit.knit_network_to_record,
            'knit-annotated-ft-gz': knit.knit_network_to_record,
            'knit-annotated-delta-gz': knit.knit_network_to_record,
            'knit-delta-closure': knit.knit_delta_closure_to_records,
            }

    def read(self):
        """Read the stream.

        :return: An iterator as per VersionedFiles.get_record_stream().
        """
        for bytes in self._bytes_iterator:
            storage_kind, line_end = network_bytes_to_kind_and_offset(bytes)
            for record in self._kind_factory[storage_kind](
                storage_kind, bytes, line_end):
                yield record