Human Generated Data

Title

Milan photograph: Wilmarth and two others with "Orange Delta for A.P.S." in progress at factory, 1973

Date

c. 1973

People

Artist: Enzo Nocera, Italian, born 1944

Classification

Archival Material

Credit Line

Harvard Art Museums/Fogg Museum, The Christopher Wilmarth Archive, Gift of Susan Wilmarth-Rabineau, CW2001.905

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Human 99.7
Person 99.7
Person 98.1
Person 95.9
Person 78.1
Silhouette 71.5
Building 69.7
Clothing 65.1
Apparel 65.1
Sitting 63.6
People 63.5
Outdoors 63
Pedestrian 60.7
Worker 57.9
Back 57.1
Flooring 57
Handrail 55.4
Banister 55.4
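
The label confidences above come from Amazon Rekognition. A minimal sketch of how tags of this form could be reproduced with the boto3 client is shown below; the file name, region, and the confidence floor are assumptions, not part of the record.

```python
# Sketch: generate image labels with Amazon Rekognition (boto3).
# The local file name and the 55% confidence floor are assumptions
# chosen to mirror the lowest scores shown above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("cw2001_905.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55.0,
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```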

Imagga
created on 2022-01-16

percussion instrument 27.8
piano 25.5
building 23.9
grand piano 23.4
musical instrument 23.4
silhouette 22.3
business 21.3
architecture 21.2
stringed instrument 20.8
interior 20.3
modern 20.3
people 20.1
keyboard instrument 19.9
glass 19.4
office 18.3
man 17.5
window 15.9
work 14.9
chair 14.8
travel 14.8
passenger 14.7
urban 14
reflection 13.8
boat 12.9
airport 12.7
light 12.7
transportation 12.6
steel 12.5
construction 12
inside 12
gondola 11.7
city 11.6
gate 11.4
device 11.1
counter 11
indoor 11
working 10.6
table 10.6
indoors 10.5
structure 10.3
male 9.9
departure 9.8
hall 9.7
technology 9.6
couple 9.6
corporate 9.4
floor 9.3
worker 9
metal 8.8
water 8.7
person 8.6
men 8.6
industry 8.5
barrier 8.5
station 8.2
laptop 8.2
room 8.1
computer 8.1
black 8
life 8
job 8
lifestyle 7.9
women 7.9
transit 7.9
corridor 7.9
waiting 7.7
meeting 7.5
evening 7.5
transport 7.3
industrial 7.3
center 7
sky 7
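
The tags above appear to come from Imagga's v2 tagging endpoint. A minimal sketch, assuming an API key/secret pair and a publeicly reachable image URL (all placeholders):

```python
# Sketch: request tags from the Imagga v2 /tags endpoint.
# API_KEY, API_SECRET, and the image URL are placeholders.
import requests

API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/cw2001_905.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```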

Microsoft
created on 2022-01-16

text 99.1
black and white 97.1
street 96.1
person 94.9
monochrome 91.7
clothing 91.1
man 74.9

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 98.1%
Calm 99.2%
Happy 0.4%
Angry 0.3%
Sad 0.1%
Disgusted 0%
Confused 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 21-29
Gender Male, 99.5%
Calm 84.1%
Sad 12.9%
Angry 1.4%
Fear 1%
Confused 0.2%
Disgusted 0.2%
Happy 0.1%
Surprised 0.1%

AWS Rekognition

Age 22-30
Gender Male, 99.2%
Calm 95.1%
Fear 3%
Angry 1.6%
Sad 0.1%
Confused 0.1%
Disgusted 0.1%
Happy 0%
Surprised 0%
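
The age range, gender, and emotion estimates above follow the shape of Amazon Rekognition's face detection output. A minimal sketch, with the file name again a placeholder:

```python
# Sketch: face attributes (age range, gender, emotions) with Rekognition.
# Attributes=["ALL"] is required to get age, gender, and emotion fields.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("cw2001_905.jpg", "rb") as f:  # hypothetical file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```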

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
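
The Google Vision rows report likelihood categories ("Very unlikely" and so on) rather than percentages; these map to the Likelihood enum in the Cloud Vision face detection response. A minimal sketch, assuming the google-cloud-vision client library and a local file path:

```python
# Sketch: face likelihoods with the Google Cloud Vision API.
# The file path is a placeholder; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("cw2001_905.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values, indexed by their integer value.
likelihood_name = ("Unknown", "Very unlikely", "Unlikely",
                   "Possible", "Likely", "Very likely")

for face in response.face_annotations:
    print("Surprise", likelihood_name[face.surprise_likelihood])
    print("Anger", likelihood_name[face.anger_likelihood])
    print("Sorrow", likelihood_name[face.sorrow_likelihood])
    print("Joy", likelihood_name[face.joy_likelihood])
    print("Headwear", likelihood_name[face.headwear_likelihood])
    print("Blurred", likelihood_name[face.blurred_likelihood])
```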

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man standing in front of a window 90.3%
a man standing next to a window 88.3%
a man that is standing in front of a window 88.2%
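
The Microsoft captions (and the Microsoft tags listed earlier) match the output of the Azure Computer Vision "describe image" operation. A minimal sketch using the azure-cognitiveservices-vision-computervision client; the endpoint, key, and file name are placeholders:

```python
# Sketch: image captions and tags from Azure Computer Vision.
# ENDPOINT, KEY, and the file name are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("cw2001_905.jpg", "rb") as f:  # hypothetical file name
    analysis = client.describe_image_in_stream(f, max_candidates=3)

for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
print("Tags:", ", ".join(analysis.tags))
```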