
Searched refs:MPI (Results 1 – 25 of 44) sorted by relevance


/external/ltp/tools/netpipe-2.4/
Makefile
   48  MPI: NPmpi target
   51  NPmpi: NPmpi.o MPI.o
   52  …$(CC) $(CFLAGS) NPmpi.o MPI.o -o NPmpi -L $(MPI_HOME)/lib/$(MPI_ARCH)/$(MPI_DEVICE) -lmpi $(EXTRA…
   57  MPI.o: MPI.c MPI.h $(INCLUDES)
   58  $(CC) $(CFLAGS) -DMPI -I$(MPI_HOME)/include -c MPI.c
README
   63  Berkeley sockets interface), MPI, and PVM. If you do not have MPI or
   66  by the TCP, MPI and PVM interfaces.
   71  as the CFLAGS compiler flags, required extra libraries, and MPI or PVM
   78  command "make TCP", "make MPI", or "make PVM" as appropriate,
   83  over TCP, MPI, or PVM, and the following section on interpreting the
  127  For MPI, how you run NPmpi may depend on the MPI implementation you
  142  To find out how to run NPmpi using any other implementation of MPI,
  145  The NetPIPE options for MPI are:
  150  May not have any effect, depending on your MPI
  239  * Add dummy "echo" commands after TCP, MPI, and PVM targets in the
netpipe.h
   47  #elif defined(MPI)
netpipe.c
   67  #ifdef MPI   in main()
   72  #ifndef MPI  in main()
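
These netpipe hits show NetPIPE's compile-time transport selection: the Makefile compiles with -DMPI, and netpipe.h/netpipe.c branch on that macro inside main(). A minimal sketch of the pattern, assuming placeholder code for the measurement loop (this is not NetPIPE's actual source):

    /* Build roughly as the Makefile above does:
     *   cc -DMPI -I$MPI_HOME/include sketch.c -lmpi
     */
    #ifdef MPI
    #include <mpi.h>
    #endif

    int main(int argc, char **argv)
    {
    #ifdef MPI
        /* MPI build: peer setup comes from mpirun, not the command line. */
        MPI_Init(&argc, &argv);
    #else
        /* TCP/PVM builds establish their own connections here instead. */
    #endif
        /* ... ping-pong latency/bandwidth measurement loop ... */
    #ifdef MPI
        MPI_Finalize();
    #endif
        return 0;
    }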
/external/ltp/tools/netpipe-2.4-ipv6/
Makefile
   48  MPI: NPmpi target
   51  NPmpi: NPmpi.o MPI.o
   52  …$(CC) $(CFLAGS) NPmpi.o MPI.o -o NPmpi -L $(MPI_HOME)/lib/$(MPI_ARCH)/$(MPI_DEVICE) -lmpi $(EXTRA…
   57  MPI.o: MPI.c MPI.h $(INCLUDES)
   58  $(CC) $(CFLAGS) -DMPI -I$(MPI_HOME)/include -c MPI.c
README
   63  Berkeley sockets interface), MPI, and PVM. If you do not have MPI or
   66  by the TCP, MPI and PVM interfaces.
   71  as the CFLAGS compiler flags, required extra libraries, and MPI or PVM
   78  command "make TCP", "make MPI", or "make PVM" as appropriate,
   83  over TCP, MPI, or PVM, and the following section on interpreting the
  127  For MPI, how you run NPmpi may depend on the MPI implementation you
  142  To find out how to run NPmpi using any other implementation of MPI,
  145  The NetPIPE options for MPI are:
  150  May not have any effect, depending on your MPI
  239  * Add dummy "echo" commands after TCP, MPI, and PVM targets in the
netpipe.h
   48  #elif defined(MPI)
netpipe.c
  104  #ifdef MPI   in main()
  109  #ifndef MPI  in main()
/external/clang/lib/StaticAnalyzer/Checkers/
CMakeLists.txt
   44  MPI-Checker/MPIBugReporter.cpp
   45  MPI-Checker/MPIChecker.cpp
   46  MPI-Checker/MPIFunctionClassifier.cpp
Android.bp
    9  subdirs = ["MPI-Checker"]
/external/valgrind/docs/internals/
mpi2entries.txt
    1  Canned summary of MPI-1.1/MPI-2 entry points, as derived from mpi.h
    2  from Open MPI svn rev 9191 (somewhere between Open MPI versions 1.0.1
3_9_BUGSTATUS.txt
  153  === MPI ================================================================
3_7_BUGSTATUS.txt
   44  287862 MPI_IN_PLACE not supported for MPI collect
3_10_BUGSTATUS.txt
  296  === MPI ================================================================
3_2_BUGSTATUS.txt
  141  and makes a valid MPI program crash.
/external/llvm/lib/Transforms/Utils/
MemorySSA.cpp
 1166  for (auto MPI = upward_defs_begin(PHIPair), MPE = upward_defs_end();  in UpwardsDFSWalk()  local
 1167  MPI != MPE; ++MPI) {  in UpwardsDFSWalk()
 1170  DT->dominates(CurrAccess->getBlock(), MPI.getPhiArgBlock());  in UpwardsDFSWalk()
 1173  UpwardsDFSWalk(MPI->first, MPI->second, Q, Backedge);  in UpwardsDFSWalk()
/external/llvm/lib/Target/WebAssembly/
WebAssemblyRegStackify.cpp
  142  const MachinePointerInfo &MPI = MMO->getPointerInfo();  in Query()  local
  143  if (MPI.V.is<const PseudoSourceValue *>()) {  in Query()
  144  auto PSV = MPI.V.get<const PseudoSourceValue *>();  in Query()
/external/clang/include/clang/StaticAnalyzer/Checkers/
Checkers.td
   75  def MPI : Package<"mpi">, InPackage<OptIn>;
  582  let ParentPackage = MPI in {
  583  def MPIChecker : Checker<"MPI-Checker">,
  584  HelpText<"Checks MPI code">,
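
Checkers.td above registers MPI-Checker in the opt-in "mpi" package. As a hedged illustration of the class of defect a checker whose HelpText reads "Checks MPI code" looks for, the sketch below reuses a nonblocking request without completing it first; the MPI calls are standard, but treating this exact pattern as a reported bug is an assumption:

    #include <mpi.h>

    void double_nonblocking(double *buf, int peer)
    {
        MPI_Request req;
        MPI_Isend(buf, 1, MPI_DOUBLE, peer, 0, MPI_COMM_WORLD, &req);
        /* `req` is overwritten before the Isend completes, so that
         * first request can never be waited on. */
        MPI_Irecv(buf, 1, MPI_DOUBLE, peer, 0, MPI_COMM_WORLD, &req);
        MPI_Wait(&req, MPI_STATUS_IGNORE);
    }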
/external/llvm/lib/Target/PowerPC/
PPCISelLowering.cpp
 4452  MachinePointerInfo MPI(CS ? CS->getCalledValue() : nullptr);  in PrepareCall()  local
 4453  SDValue LoadFuncPtr = DAG.getLoad(MVT::i64, dl, LDChain, Callee, MPI,  in PrepareCall()
 4460  MPI.getWithOffset(16), false, false,  in PrepareCall()
 4466  MPI.getWithOffset(8), false, false,  in PrepareCall()
 6408  MachinePointerInfo MPI =  in LowerFP_TO_INTForReuse()  local
 6416  MF.getMachineMemOperand(MPI, MachineMemOperand::MOStore, 4, 4);  in LowerFP_TO_INTForReuse()
 6422  MPI, false, false, 0);  in LowerFP_TO_INTForReuse()
 6429  MPI = MPI.getWithOffset(Subtarget.isLittleEndian() ? 0 : 4);  in LowerFP_TO_INTForReuse()
 6434  RLI.MPI = MPI;  in LowerFP_TO_INTForReuse()
 6480  return DAG.getLoad(Op.getValueType(), dl, RLI.Chain, RLI.Ptr, RLI.MPI, false,  in LowerFP_TO_INT()
[all …]
PPCISelLowering.h
  751  MachinePointerInfo MPI;  member
/external/valgrind/
NEWS.old
  480  Lackey has been improved, and MPI support has been added. In detail:
  548  - MPI support: partial support for debugging distributed applications
  549  using the MPI library specification has been added. Valgrind is
  550  aware of the memory state changes caused by a subset of the MPI
  581  again, and was required for MPI support.
 1026  use valgrind for debugging MPI-based programs. The relevant
NEWS
 1287  n-i-bz Fixes for more MPI false positives
 1722  troublesome pieces of code. The MPI wrapper library (libmpiwrap.c)
 2665  * For people who use Valgrind with MPI programs, the installed
configure.ac
 3607  # MPI checks
 3609  # Do we have a useable MPI setup on the primary and/or secondary targets?
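
The NEWS entries above describe Valgrind's MPI wrapper library (libmpiwrap.c), which intercepts a subset of MPI calls so Valgrind can track the memory-state changes they cause. Below is a minimal program of the kind one runs under it; the preload path and wrapper filename in the comment are illustrative assumptions, since they vary by installation and platform:

    /* Assumed invocation (paths illustrative):
     *   LD_PRELOAD=<valgrind-libdir>/libmpiwrap-<platform>.so \
     *       mpirun -np 2 valgrind ./hello_mpi
     */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        printf("rank %d of %d\n", rank, size);
        MPI_Finalize();
        return 0;
    }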
/external/llvm/lib/CodeGen/AsmPrinter/
CodeViewDebug.cpp
 1224  MemberPointerInfo MPI(  in lowerTypeMemberPointer()  local
 1226  PointerRecord PR(PointeeTI, PK, PM, PO, SizeInBytes, MPI);  in lowerTypeMemberPointer()
/external/clang/include/clang/Basic/
AttrDocs.td
 1315  * MPI library implementations, where these attributes enable checking that
 1317  * for HDF5 library there is a similar use case to MPI;
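
The AttrDocs.td hits document clang's type-tag attributes, which an MPI implementation can apply so the compiler checks that a buffer's pointee type matches the datatype argument (clang reports mismatches under -Wtype-safety). A sketch of the annotation pattern, with simplified declarations that are assumptions rather than any real MPI header:

    typedef struct MPI_Datatype_impl *MPI_Datatype;

    /* Tie the MPI_INT constant to the C type `int`. */
    extern struct MPI_Datatype_impl mpi_int_impl
        __attribute__((type_tag_for_datatype(mpi, int)));
    #define MPI_INT ((MPI_Datatype)&mpi_int_impl)

    /* Argument 1 is the buffer, argument 3 carries the type tag. */
    int MPI_Send(void *buf, int count, MPI_Datatype datatype)
        __attribute__((pointer_with_type_tag(mpi, 1, 3)));

    /* Calling MPI_Send with a double* buffer and MPI_INT would now warn. */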
