I develop on a few PCs and two MacBook Pros. My PC has a 4790K w/ GTX 980 G1 (4GB VRAM), and the MacBook Pro 15 has the built-in Intel Iris Pro (1.5GB VRAM) and a 2.8 GHz i7. The PC is quicker, with smoother FPS, especially in the in-game preview window. With the PC I get 60-90 FPS, while the Mac stays at 40 FPS, in a scene that's about 386k-400k tris. Also, with 2-3 Blueprints open the MBP drops to 15-20 FPS, while the PC w/ GTX 980 maintains over 60! The MacBook is still very usable, albeit performance drops when it goes full screen (Retina is 2560 x 1600) or multiple Blueprints are open. Since I develop for Mobile, I keep the viewport relatively small.
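For anyone wanting to reproduce these numbers: I believe the simplest approach (and what I assume most people do) is to type the stock stat commands into the ~ console and shrink the render resolution rather than the window. Something like:

```
stat fps
stat unit
r.ScreenPercentage 50
```

stat fps overlays the frame rate, stat unit splits the frame into Game/Draw/GPU milliseconds so you can see whether the CPU or GPU is the wall, and r.ScreenPercentage renders the viewport at a fraction of native resolution (the 50 is just an example value; it helps a lot on the Retina panels).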
On a side note, I have an AMD machine with an A10-7850K APU in CrossFire with an R7 240. The fully built app (not in editor) runs at barely 25-30 FPS at 720p. I also have an i3-4160T running a GTX 960 w/ 4GB VRAM. It runs 1080p at a solid 30-40 FPS in a standalone build. Performance varies completely from machine to machine, and also between the in-editor game preview/selected viewport and external app preview windows. I'm using Mobile HDR too, and since the apps aren't intended to run at these high resolutions, these tests are somewhat moot.
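For reference, Mobile HDR lives in Project Settings > Rendering, which just writes a line into Config/DefaultEngine.ini; a minimal sketch of the relevant section (the comment is mine):

```
[/Script/Engine.RendererSettings]
; True = Mobile HDR on (what I'm using); False = LDR, much cheaper on low-end GPUs
r.MobileHDR=True
```

And for the standalone 720p/1080p runs, resolution can be forced on the command line with the stock flags, e.g. MyGame.exe -windowed -ResX=1280 -ResY=720 (MyGame.exe standing in for whatever your packaged executable is called).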
The worst scenario I have is my 13" non-Retina MacBook Pro (2012) with an i5 and built-in Intel HD 4000 graphics. It's almost unusable, even with a tiny viewport preview, let alone full screen!
Just wanted to put my experiences out there, though it may feel like incessant rambling.
These were all done using Unreal 4.12.5. Bottom line, it all depends on what you're developing for. Mobile HDR/LDR, resolution, etc. are all contributing factors. I used the Intel Iris Pro (1536MB) on my Retina MBP, which runs fine in most scenarios for Mobile development. If I were to build a complex 1080p full-screen game with full particle effects, bloom, post-processing, etc., I wouldn't go for anything less than a GTX 960/970/980 or the comparable AMD card, plus 2-4GB+ of VRAM if your project is AAA or targets consoles or standalone Windows/Mac builds.
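One last sketch in case it helps anyone sizing hardware: to get a feel for how much of the cost is post-processing rather than raw geometry, you can zero out the stock post-process cvars from the console (values are just illustrative; the defaults vary by scalability level):

```
r.BloomQuality 0
r.MotionBlurQuality 0
r.DepthOfFieldQuality 0
r.PostProcessAAQuality 0
```

If the frame rate jumps noticeably with these at 0, it's the post-processing chain, not the triangle count, that's dictating which GPU tier you actually need.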