Apache SINGA
A distributed deep learning platform.
Blob< Dtype > Class Template Reference

Public Member Functions

 Blob (const vector< int > &shape)
 
void Reshape (const vector< int > &shape)
 Change the dimensions of the blob, allocating new memory if necessary. More...
 
void ReshapeLike (const Blob &other)
 
const vector< int > & shape () const
 
int count () const
 
void set_version (int v)
 
const int version () const
 
void CopyFrom (const Blob< Dtype > &source, bool reshape=false)
 Copy from a source Blob. More...
 
const shared_ptr< SyncedMemory > & data () const
 
const Dtype * cpu_data () const
 
void set_cpu_data (Dtype *data)
 
const Dtype * gpu_data () const
 
Dtype * mutable_cpu_data ()
 
Dtype * mutable_gpu_data ()
 
void ToProto (singa::BlobProto *proto) const
 
Dtype asum_data () const
 Compute the sum of absolute values (L1 norm) of the data.
 
Dtype sum_data () const
 
void ShareData (const Blob &other)
 Set the data_ shared_ptr to point to the SyncedMemory holding the data_ of Blob other; useful in Layers which simply perform a copy in their Forward pass. More...
 
void Swap (Blob &other)
 
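A minimal usage sketch of the interface above. The header path and the use of the global namespace are assumptions, not part of this reference; adjust them to the actual source tree.

    #include <vector>
    #include "utils/blob.h"   // assumed header path

    void BlobExample() {
      // Construct a 2 x 3 blob; count() is the product of the dimensions.
      Blob<float> blob(std::vector<int>{2, 3});
      int n = blob.count();   // 6

      // Write through the mutable CPU pointer.
      float* data = blob.mutable_cpu_data();
      for (int i = 0; i < n; ++i)
        data[i] = static_cast<float>(i) - 2.0f;  // -2, -1, 0, 1, 2, 3

      // asum_data() is the L1 norm (sum of |x_i|); sum_data() is the plain sum.
      float l1  = blob.asum_data();  // 9
      float sum = blob.sum_data();   // 3
      (void)l1; (void)sum;
    }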

Public Attributes

shared_ptr< SyncedMemory > data_
 

Protected Attributes

vector< int > shape_
 
int count_
 
int capacity_
 
int version_
 

Member Function Documentation

template<typename Dtype>
void Blob< Dtype >::CopyFrom (const Blob< Dtype > & source, bool reshape = false)

Copy from a source Blob.

Parameters
    source   the Blob to copy from
    reshape  if false (the default), this Blob must already have the same shape as source, and the call aborts otherwise; if true, this Blob is reshaped to source's shape if necessary
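A short sketch of the two modes, following the parameter description above (the shapes are chosen for illustration only):

    Blob<float> src(std::vector<int>{4, 5});
    Blob<float> dst(std::vector<int>{2, 2});

    // reshape == true: dst is first reshaped to {4, 5}, then the data is copied.
    dst.CopyFrom(src, true);

    // reshape == false (the default): the shapes must already match;
    // a mismatch aborts the program.
    dst.CopyFrom(src);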
template<typename Dtype>
void Blob< Dtype >::Reshape ( const vector< int > &  shape)

Change the dimensions of the blob, allocating new memory if necessary.

This function can be called both to create an initial allocation of memory and to adjust the dimensions of a top blob during Layer::Reshape or Layer::Forward. When changing the size of the blob, memory is only reallocated if sufficient memory does not already exist, and excess memory is never freed.

Note that reshaping an input blob and immediately calling Net::Backward is an error; either Net::Forward or Net::Reshape needs to be called to propagate the new input shape to higher layers.
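A sketch of the reallocation rule described above (the shapes are illustrative):

    Blob<float> blob(std::vector<int>{10, 10});  // allocates room for 100 elements
    blob.Reshape(std::vector<int>{5, 5});        // 25 <= capacity: no reallocation
    blob.Reshape(std::vector<int>{10, 10});      // still within capacity: no reallocation
    blob.Reshape(std::vector<int>{20, 20});      // 400 > capacity: new memory is allocated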

template<typename Dtype>
void Blob< Dtype >::ShareData ( const Blob< Dtype > &  other)

Set the data_ shared_ptr to point to the SyncedMemory holding the data_ of Blob other. This is useful in Layers which simply perform a copy in their Forward pass.

This releases the SyncedMemory previously held by this Blob's data_: assigning to a shared_ptr drops its old reference, destroying the SyncedMemory if this was the last reference to it.
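A sketch of the sharing semantics (a hypothetical example, not taken from the SINGA sources):

    Blob<float> a(std::vector<int>{3});
    Blob<float> b(std::vector<int>{3});

    b.ShareData(a);                 // b's previous SyncedMemory is released here
    a.mutable_cpu_data()[0] = 7.0f;
    // a and b now read the same memory: b.cpu_data()[0] == 7.0f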

