Human Generated Data

Title

Untitled (scientists looking into pool of water at research facility)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15978

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 96.9
Person 96.9
Building 88.9
Indoors 86.4
Interior Design 86.4
Architecture 86.1
Elevator 84.6
Banister 83
Handrail 83
Person 69.1
Transportation 60.5
Train 60.5
Vehicle 60.5
Housing 59.1
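
The label/confidence pairs above are the kind of output returned by an object-detection service such as Amazon Rekognition. A minimal sketch of how comparable labels could be requested with the boto3 SDK follows, assuming the image were hosted in S3; the bucket name, object key, and region are placeholders, not part of this record.

# Hypothetical sketch: object labels via Amazon Rekognition (boto3).
# Bucket, key, and region are placeholders, not taken from this record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "gould_untitled.jpg"}},
    MaxLabels=20,
    MinConfidence=55,
)

# Print each label with its confidence, mirroring the "Label NN.N" lines above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')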

Imagga
created on 2022-02-05

locker 55.4
fastener 44.3
case 39.8
device 35.4
restraint 33.3
industrial 29.9
interior 28.3
industry 28.2
power 26
building 26
steel 25.6
factory 25.1
modern 24.5
metal 22.5
urban 21.8
equipment 21.6
architecture 21.4
technology 19.3
inside 17.5
business 17
city 16.6
pipe 16.5
glass 16.3
construction 16.2
structure 15.9
window 15.8
plant 15.7
light 15.4
pollution 14.4
energy 14.3
production 13.6
office 13.2
tube 12.7
reflection 12.4
elevator 12.4
heavy 12.4
engineering 12.4
system 12.4
environment 12.3
oil 12.1
pipes 11.8
machine 11.8
manufacturing 11.7
supply 11.6
gas 11.5
sky 11.5
hall 11.3
heat 11.1
work 11
design 10.7
science 10.7
fuel 10.6
center 10.3
lifting device 10
machinery 9.7
station 9.7
computer 9.7
steam 9.7
complex 9.7
television 9.4
water 9.3
futuristic 9
monitor 9
corridor 8.8
pump 8.8
indoors 8.8
waste 8.7
stainless 8.7
room 8.7
effects 8.5
3d 8.5
electricity 8.5
perspective 8.5
environmental 8.5
electric 8.4
iron 8.4
economy 8.3
indoor 8.2
global 8.2
digital 8.1
new 8.1
door 7.9
piping 7.9
refinery 7.9
valve 7.9
bright 7.9
cables 7.8
airport 7.8
sliding door 7.8
high 7.8
education 7.8
mall 7.8
technical 7.7
chemical 7.7
storage 7.6
communication 7.6
hot 7.5
house 7.5
three dimensional 7.5
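
The Imagga tags above come from a general-purpose auto-tagging service. A minimal sketch of how similar tags could be fetched from the Imagga v2 REST API is given below; the image URL and the API key/secret are placeholders, and the response layout is assumed from Imagga's public documentation.

# Hypothetical sketch: auto-tagging via the Imagga v2 REST API (requests).
# The image URL and credentials are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/gould_untitled.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
response.raise_for_status()

# Each entry carries a confidence score and a language-keyed tag name.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')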

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

water 87.1
text 85.5
swimming pool 51.3

Face analysis

AWS Rekognition

Age 59-69
Gender Male, 99.9%
Calm 84.8%
Happy 6%
Confused 4.8%
Sad 1.6%
Angry 1.1%
Surprised 0.9%
Disgusted 0.5%
Fear 0.2%
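
The age range, gender, and emotion percentages above are typical of Amazon Rekognition face attributes. A minimal sketch of how such values could be retrieved with boto3 follows; the bucket and key are placeholders, not part of this record.

# Hypothetical sketch: face attributes via Amazon Rekognition detect_faces.
# Bucket and key are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "gould_untitled.jpg"}},
    Attributes=["ALL"],
)

# Report age range, gender, and emotions per detected face, as listed above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')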

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
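
The likelihood ratings above match the face-annotation attributes exposed by the Google Cloud Vision API. A minimal sketch of how they could be obtained with the google-cloud-vision client follows; the image URI is a placeholder and credentials are assumed to come from the environment.

# Hypothetical sketch: face likelihoods via the Google Cloud Vision API.
# The image URI is a placeholder; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/gould_untitled.jpg"

response = client.face_detection(image=image)

# Each annotation reports a likelihood enum (e.g. VERY_UNLIKELY) per attribute.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)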

Feature analysis

Amazon

Person 96.9%
Train 60.5%

Captions

Microsoft

a glass display case 31.4%
a close up of a glass display case 29.6%
a display in a store 29.5%
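
Captions with confidence scores like those above are the kind of output produced by the Microsoft Azure Computer Vision "describe" operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders.

# Hypothetical sketch: image captions via Azure Computer Vision describe_image.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("AZURE_VISION_KEY"),
)

description = client.describe_image(
    "https://example.org/gould_untitled.jpg", max_candidates=3
)

# Each candidate caption carries a confidence score, as in the list above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")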

Text analysis

Amazon

C8
89
88
DS
ى
AB
A6
16 DS
IFS
A6 . AB
IC
MJIR
1.
VISING
MJIR VISING МАБОА
15
16
.
МАБОА
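
The fragments above are raw text detections rather than a cleaned transcription, which is why isolated codes and mixed scripts appear. A minimal sketch of how such detections could be produced with Amazon Rekognition text detection follows; the bucket and key are placeholders, not part of this record.

# Hypothetical sketch: text detection via Amazon Rekognition detect_text.
# Bucket and key are placeholders; output is raw word-level detections.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "gould_untitled.jpg"}}
)

# WORD detections reproduce isolated fragments such as "C8" or "DS".
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])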

Google

83
NA2 A3 A4 AG A7 AB A BI 82 83 P B7 B8 B9 E8 E9 YT37AP2
NA2
AG
BI
P
B9
A3
A7
AB
A
B7
B8
E9
A4
82
E8
YT37AP2