Human Generated Data

Title

Untitled (scientists looking into pool of water at research facility)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15980.1

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 97.9
Human 97.9
Person 97
Building 88.2
Banister 85
Handrail 85
Architecture 84.2
Window 58.2
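Each tag above pairs a label with a 0–100 confidence score. As a minimal sketch (assuming the plain `label score` line format rendered here, which is a display convention, not a documented export format), the list can be parsed and filtered to high-confidence labels:

```python
# Sample data copied from the Amazon (Rekognition) tag list above;
# the 0-100 confidence scale is assumed from the display.
raw = """Person 97.9
Human 97.9
Person 97
Building 88.2
Banister 85
Handrail 85
Architecture 84.2
Window 58.2"""

def parse_tags(text):
    """Split each 'label score' line into a (name, score) pair."""
    tags = []
    for line in text.strip().splitlines():
        name, score = line.rsplit(" ", 1)
        tags.append((name, float(score)))
    return tags

def confident(tags, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in tags if score >= threshold]

print(confident(parse_tags(raw)))
# ['Person', 'Human', 'Person', 'Building', 'Banister', 'Handrail', 'Architecture']
```

The threshold of 80 is arbitrary; different services calibrate their scores differently, so a cutoff tuned for one tagger will not transfer directly to another.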

Imagga
created on 2022-02-05

building 32.1
structure 30.1
urban 27.1
architecture 26.7
equipment 24.9
steel 24.7
modern 24.5
industry 23.9
industrial 23.6
power 22.7
city 22.4
television 22.4
metal 19.3
construction 18.8
technology 18.5
factory 18.3
window 18.2
glass 17.9
office 17
sky 16.7
device 16.6
business 16.4
high 15.6
monitor 15.5
engineering 15.2
broadcasting 14.9
energy 13.4
heavy 13.3
effects 13.3
interior 13.3
inside 12.9
plant 12.7
station 12.6
pollution 12.5
reflection 12.3
center 12.2
tube 11.8
machinery 11.7
production 11.7
pipe 11.7
supply 11.6
fuel 11.6
light 11.4
digital 11.3
electricity 11.3
electric 11.2
three dimensional 11.2
telecommunication 11.1
graphics 10.9
locker 10.9
electronic equipment 10.7
environment 10.7
waste 10.7
science 10.7
computer 10.6
gas 10.6
iron 10.3
oil 10.2
3d 10.1
pipes 9.8
billboard 9.8
case 9.8
airport 9.8
hall 9.7
system 9.5
work 9.4
finance 9.3
heat 9.3
global 9.1
facility 9
transportation 9
piping 8.9
fastener 8.8
pump 8.8
machine 8.8
steam 8.7
elevator 8.7
render 8.6
windows 8.6
concrete 8.6
imagination 8.5
design 8.4
signboard 8.4
economy 8.3
financial 8
night 8
pipeline 7.9
petrol 7.8
chimney 7.8
cables 7.8
warming 7.8
chemical 7.7
line 7.7
estate 7.6
electronics 7.6
perspective 7.5
liquid crystal display 7.5
medium 7.4
lifting device 7.3
lines 7.2
travel 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 76
water 65.7
ship 62.8

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Calm 75.5%
Confused 15.2%
Sad 4.5%
Happy 1.4%
Disgusted 1.3%
Surprised 0.9%
Angry 0.7%
Fear 0.5%

AWS Rekognition

Age 24-34
Gender Male, 100%
Sad 63.7%
Calm 34.8%
Confused 0.6%
Angry 0.5%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%
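Each Rekognition face block reports an age range, a gender estimate, and a distribution of emotion scores. A small sketch (a hypothetical helper, not part of the museum's pipeline) of picking the dominant emotion from such a block:

```python
# Emotion percentages copied from the second AWS Rekognition face
# block above (displayed as a 0-100 distribution).
emotions = {
    "Sad": 63.7, "Calm": 34.8, "Confused": 0.6, "Angry": 0.5,
    "Happy": 0.1, "Surprised": 0.1, "Disgusted": 0.1, "Fear": 0.1,
}

def dominant(scores):
    """Return the (emotion, score) pair with the highest score."""
    name = max(scores, key=scores.get)
    return name, scores[name]

print(dominant(emotions))  # ('Sad', 63.7)
```

Note that the two faces in this record disagree strongly (one mostly Calm, one mostly Sad), which is why the full distribution is more informative than the top label alone.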

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.9%

Captions

Microsoft

a store front at night 84.5%
a sign above a store 72.2%
a store front at day 72.1%

Text analysis

Amazon

A8
B9
C8
B8
A4
A6
B7 B8 B9
C8 C9
A6 A7 A8 AS
D9
7
C9
B7
AS
A7
EB
A3 A4
FP
D8 D9
MEELA
A3
D8
EB E9
FP 199
E9
MEELA LIFA
FS
Z
7 13 Z
FS F6|F7
199
LITE
F6|F7
LIFA
12
12 R
R
PM
a
13