Hello, I can't understand why these two pointer expressions give different results:
#define WIN32_LEAN_AND_MEAN
#include <windows.h>

int e_p(void)
{
    LPBYTE PE = (LPBYTE)HeapAlloc(GetProcessHeap(), 0, 1024);
    DWORD *pointer = 0, *pointer2 = 0;
    if (PE != NULL)
    {
        pointer  = (DWORD*)(PE + 0xC);
        pointer2 = (DWORD*)(PE + 0xC); // pointer2 equals pointer
        pointer2 = (DWORD*)PE + 0xC;   // not equal!
        HeapFree(GetProcessHeap(), 0, PE);
    }
    ExitProcess(0);
}
Why is pointer != pointer2 after the last assignment? I thought they would be equal, but they aren't.
Because in
pointer2 = (DWORD*)(PE + 0xC);
'PE' is a byte pointer, so 0xC is added to it first, and by the rules of pointer arithmetic that yields the address 'PE + sizeof(BYTE) * 0xC', i.e. PE plus 12 bytes. Only then is the result cast to a pointer to DWORD.

In the second case
pointer2 = (DWORD*)PE + 0xC;
you first cast 'PE' to a DWORD pointer and only then add the offset. Now pointer arithmetic scales by the element size, so the result is 'PE + sizeof(DWORD) * 0xC'. Since sizeof(DWORD) == 4, you're adding 0xC * 4 = 0x30 bytes instead of 0xC.
Thank you, frankie!