Human Generated Data

Title

Untitled ("Hangman's Holiday," man thrown from car to telephone pole)

Date

c. 1940

People

Artist: William Dyviniak

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3603

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 96.8
Musician 95.4
Musical Instrument 95.4
Transportation 94.4
Automobile 94.4
Vehicle 94.4
Car 94.4
Wheel 90.3
Machine 90.3
Percussion 84.5
Drummer 84.5
Wheel 79.4
Person 76.8
Person 73.3
Leisure Activities 65.1

Imagga
created on 2021-12-14

wheelchair 30.3
device 26.4
car 25.5
chair 25.4
electrical device 23.6
vehicle 23.5
transportation 18.8
wheel 18
travel 16.9
city 16.6
old 16
building 15.3
seat 15.2
architecture 14.8
model t 14
transport 12.8
outdoors 12.7
equipment 12.5
wheeled vehicle 12.4
forklift 12
industry 11.9
truck 11.8
motor vehicle 11.5
outdoor 10.7
automobile 10.5
machine 10.2
sky 10.2
street 10.1
people 10
black 9.7
auto 9.6
furniture 9.2
person 9.1
adult 9.1
pole 9
garage 8.9
carriage 8.9
man 8.7
house 8.4
rod 8.3
road 8.1
landmark 8.1
work 8.1
water 8
sitting 7.7
senior 7.5
tourism 7.4
light 7.3
industrial 7.3
lamp 7.2
color 7.2

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 96.5
outdoor 91.5
old 91
black 79.6
black and white 73.1
white 66
vintage 59.8

Face analysis

AWS Rekognition

Age 34-50
Gender Male, 55.3%
Calm 85%
Sad 6.1%
Happy 4.2%
Angry 2%
Fear 0.9%
Confused 0.8%
Disgusted 0.6%
Surprised 0.5%

AWS Rekognition

Age 31-47
Gender Male, 92.7%
Calm 98.6%
Sad 1%
Happy 0.1%
Angry 0.1%
Confused 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 17-29
Gender Female, 82.2%
Fear 95%
Surprised 2.2%
Sad 1.6%
Calm 0.5%
Confused 0.3%
Angry 0.2%
Happy 0.2%
Disgusted 0%

AWS Rekognition

Age 23-35
Gender Male, 96.8%
Calm 72.3%
Angry 9.1%
Happy 7.8%
Sad 3%
Fear 2.8%
Surprised 2.2%
Confused 2.2%
Disgusted 0.7%

AWS Rekognition

Age 28-44
Gender Female, 79.7%
Sad 91.8%
Calm 5.5%
Fear 1.1%
Happy 0.6%
Angry 0.4%
Confused 0.3%
Surprised 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.5%
Car 94.4%
Wheel 90.3%

Captions

Microsoft

a vintage photo of a truck 84.9%
a vintage photo of a person riding on the back of a truck 52.7%
a vintage photo of a group of people standing in front of a truck 52.6%