Human Generated Data

Title

Untitled (scientists looking into pool of water at research facility)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15976.3

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 97.7
Person 97.7
Person 95.4
Interior Design 95.4
Indoors 95.4
Room 87.2
Handrail 86
Banister 86
Building 83.9
Architecture 75.5
Housing 71.6
Train 62.4
Transportation 62.4
Vehicle 62.4
Home Decor 57.1
Window 56.6

Imagga
created on 2022-02-05

locker 46.7
fastener 37.6
device 35.6
building 33.9
architecture 33.6
structure 30.4
industry 29
restraint 28.2
industrial 27.2
city 25.8
construction 25.7
steel 25
urban 24.5
elevator 21.8
sky 21.7
high 19.9
case 19
modern 18.9
engineering 18.1
equipment 17.9
metal 17.7
power 17.6
factory 17.4
lifting device 17
interior 16.8
window 15
business 14.6
tower 13.4
technology 13.4
inside 12.9
light 12.7
energy 12.6
house 12.5
office 12.3
reflection 12.2
glass 11.7
station 11.6
cable 11.5
heavy 11.4
electricity 11.3
wire 11.3
plant 11.2
production 10.7
pollution 10.6
travel 10.6
system 10.5
stairs 9.8
pipe 9.7
center 9.7
concrete 9.6
perspective 9.4
environment 9
design 9
hall 9
cables 8.8
airport 8.8
lift 8.8
machinery 8.8
chemical 8.7
tube 8.7
water 8.7
fuel 8.7
oil 8.4
metallic 8.3
silhouette 8.3
crane 8.1
new 8.1
lines 8.1
transportation 8.1
night 8
machine 8
working 7.9
indoors 7.9
facility 7.9
staircase 7.9
walkway 7.8
work 7.8
line 7.8
waste 7.8
complex 7.8
supply 7.7
gas 7.7
windows 7.7
old 7.7
frame 7.5
electric 7.5
iron 7.5
sun 7.4
economy 7.4
street 7.4
transport 7.3
global 7.3
people 7.2

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

indoor 87.9
water 84.3
ship 80.9
text 78.9

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 99.8%
Sad 97.5%
Confused 1.1%
Calm 0.7%
Angry 0.3%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 31-41
Gender Male, 100%
Calm 99.8%
Sad 0.1%
Angry 0.1%
Surprised 0%
Confused 0%
Disgusted 0%
Happy 0%

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.7%
Train 62.4%

Captions

Microsoft

a glass display case 34.2%

Text analysis

Amazon

89
88
C8
02
C2
E9
C9
D9
E8 E9
H
A6
08 D9
E8
و
A8
24
B7
08
Fa C8 C9
A6 AT A8
24 - No
2
-
FIE elit B7 88 89
E 2
il 02
No
E
AT
B1
B1 B2|83
FB
All
B2|83
All 12/13/14
F5/F6/F7
F1/F2/F3|F4 F5/F6/F7 FB
z
Fa
FIE
12/13/14
il
F1/F2/F3|F4
elit

Google

2434A6 AT ABIAS E 82 83 37 B8 89 D8 09 - EB E9 E E2
2434A6
D8
E2
E
AT
37
B8
89
E9
ABIAS
83
09
EB
82
-