Human Generated Data

Title

Untitled (bride and groom departing)

Date

1970s, printed later

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1037

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.2%
Human 99.2%
Person 95.6%
Person 94%
Apparel 93%
Clothing 93%
Vehicle 86.5%
Transportation 86.5%
Car 86.5%
Automobile 86.5%
Pedestrian 74.6%
Coat 74.3%
Overcoat 71%
Person 68.2%
Face 56.3%
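The machine-generated tags above pair each label with a confidence score on a 0–100 scale, and the same label can appear more than once (one entry per detected instance). A minimal sketch of working with such a list, using the Amazon Rekognition pairs from this record; the helper name and the 90-point threshold are illustrative choices, not part of the record:

```python
# Label/score pairs copied from the Amazon Rekognition tag list above.
tags = [
    ("Person", 99.2), ("Human", 99.2), ("Person", 95.6), ("Person", 94.0),
    ("Apparel", 93.0), ("Clothing", 93.0), ("Vehicle", 86.5),
    ("Transportation", 86.5), ("Car", 86.5), ("Automobile", 86.5),
    ("Pedestrian", 74.6), ("Coat", 74.3), ("Overcoat", 71.0),
    ("Person", 68.2), ("Face", 56.3),
]

def confident_labels(pairs, threshold=90.0):
    """Return unique labels whose best (highest) score meets the threshold."""
    best = {}
    for label, score in pairs:
        best[label] = max(score, best.get(label, 0.0))
    return sorted(label for label, score in best.items() if score >= threshold)

print(confident_labels(tags))
# ['Apparel', 'Clothing', 'Human', 'Person']
```

Keeping only each label's highest score before thresholding avoids counting repeated detections (e.g. the three "Person" entries) more than once.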

Imagga
created on 2022-02-26

car 100%
motor vehicle 100%
limousine 100%
vehicle 43.9%
transportation 41.2%
automobile 41.2%
auto 38.3%
wheeled vehicle 29.7%
transport 29.2%
minivan 24.9%
drive 24.6%
wheel 23.6%
road 23.5%
driver 23.3%
beach wagon 22.4%
smile 20.7%
travel 20.4%
happy 20%
driving 19.3%
motor 18.4%
van 18%
passenger van 17.5%
man 17.5%
adult 17.5%
cab 17%
person 16.6%
sitting 16.3%
people 16.2%
smiling 15.9%
truck 15.5%
male 14.9%
portrait 13.6%
outdoors 12.7%
new 12.1%
street 12%
happiness 11.8%
traffic 11.4%
luxury 11.1%
pretty 10.5%
attractive 10.5%
garage 10.3%
joy 10%
modern 9.8%
cheerful 9.8%
insurance 9.7%
door 9.6%
urban 9.6%
casual 9.3%
speed 9.2%
city 9.1%
fashion 9%
fun 9%
style 8.9%
business 8.5%
trip 8.5%
power 8.4%
summer 8.4%
sports 8.3%
window 8.2%
hair 7.9%
parked 7.9%
black 7.8%
passenger 7.7%
engine 7.7%
fast 7.5%
inside 7.4%
looking 7.2%
tire 7.2%
cool 7.1%
day 7.1%

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

street 97.8%
vehicle 95.3%
car 95%
land vehicle 94.2%
text 91.7%
city 64.7%
person 54.4%
wheel 51.5%

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 90%
Calm 41.4%
Confused 34%
Sad 18.8%
Happy 2%
Fear 1.5%
Angry 1.1%
Surprised 0.8%
Disgusted 0.5%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 98.2%
Angry 0.9%
Confused 0.3%
Fear 0.2%
Happy 0.2%
Surprised 0.1%
Disgusted 0.1%
Sad 0.1%

AWS Rekognition

Age 23-33
Gender Female, 100%
Happy 99.3%
Surprised 0.3%
Confused 0.1%
Calm 0.1%
Fear 0.1%
Angry 0.1%
Disgusted 0%
Sad 0%

AWS Rekognition

Age 16-24
Gender Male, 94.3%
Calm 91.6%
Angry 3.7%
Sad 1.5%
Happy 1%
Confused 0.6%
Disgusted 0.6%
Fear 0.5%
Surprised 0.5%

Microsoft Cognitive Services

Age 37
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Car 86.5%

Captions

Microsoft

a group of people standing in front of a bus 89.4%
a group of people standing around a bus 87.7%
a group of people in front of a bus 86.2%