Human Generated Data

Title

Untitled (bride and groom entering building)

Date

c. 1950

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18962

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.4
Person 99.4
Person 99.3
Apparel 98.9
Clothing 98.9
Person 97.5
Person 97.3
Automobile 95.4
Car 95.4
Vehicle 95.4
Transportation 95.4
Shorts 94.7
Chair 88.3
Furniture 88.3
Female 81.8
Machine 81.6
Wheel 81.6
Sleeve 69.8
Long Sleeve 67.4
Girl 65.5
Woman 65.4
Car 64.1
Pants 58.6
Sports 58.5
Sport 58.5
Building 58.4
Shirt 58
Table 57.8
Dining Table 57.8
Outdoors 57.6
Leisure Activities 57.4
Coat 57.2
Play 56.1
Sitting 55.7
Fencing 55.2

Imagga
created on 2022-03-05

deck 100
business 30.4
man 25.5
people 25.1
building 20.7
businessman 20.3
corporate 19.8
male 19.1
urban 16.6
businesswoman 16.4
adult 15.7
professional 15.5
success 15.3
transportation 15.2
lifestyle 15.2
city 15
group 14.5
office 14.4
travel 14.1
happy 13.8
men 13.7
day 13.3
working 13.3
person 13
women 12.6
modern 12.6
train 12.6
work 12.6
successful 11.9
station 11.9
transport 11.9
worker 11.7
team 11.6
airport 11.2
suit 10.8
vacation 10.6
human 10.5
outdoors 10.4
walking 10.4
company 10.2
smiling 10.1
smile 10
hall 9.9
metro 9.9
pretty 9.8
job 9.7
interior 9.7
portrait 9.7
career 9.5
sport 9.5
meeting 9.4
journey 9.4
teamwork 9.3
speed 9.2
holding 9.1
20 24 years 8.9
angle 8.7
diversity 8.6
bright 8.6
passenger 8.6
attractive 8.4
communication 8.4
manager 8.4
life 8.1
indoors 7.9
subway 7.9
black 7.8
move 7.7
blurred 7.7
canvas tent 7.4
inside 7.4
occupation 7.3
architecture 7.3
color 7.2
fitness 7.2
happiness 7
together 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.3
black and white 89.6
tennis 71.2

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Male, 90%
Calm 58.2%
Happy 13.9%
Sad 13.4%
Surprised 6.6%
Fear 2.7%
Confused 2.7%
Angry 1.6%
Disgusted 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Car 95.4%
Wheel 81.6%

Captions

Microsoft

a person standing in front of a store 65.3%
a person standing in front of a building 63.2%
a person standing in front of a store 54.5%

Text analysis

Amazon

6
KODAK
SAFETY
3
s
Z
7

Google

3.
SAFETY
KODÁK
9
3. KODÁK SAFETY 9