Human Generated Data

Title

Untitled (bride and groom eating at wedding)

Date

c. 1950

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18953

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Restaurant 99.4
Person 99
Person 96.7
Furniture 93.5
Table 93.5
Dining Table 93.5
Meal 92.4
Food 92.4
Pottery 91.7
Home Decor 91.6
Dish 75.8
Cup 74.8
Coffee Cup 74.8
Cafe 70.7
Saucer 70.4
Food Court 70
Clothing 64.5
Apparel 64.5
Cafeteria 58.6
Shirt 58.4
Dating 56.6
Tablecloth 56.2
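The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label detection. As a minimal sketch (using a subset of the scores listed here, not a live API call), tags like these can be filtered by a confidence threshold before display:

```python
# Sketch: filter machine-generated tags by confidence, as one might
# post-process Rekognition-style label output. The pairs below are a
# subset of the Amazon tags listed above.
tags = [
    ("Person", 99.8), ("Restaurant", 99.4), ("Dining Table", 93.5),
    ("Meal", 92.4), ("Coffee Cup", 74.8), ("Tablecloth", 56.2),
]

def confident_tags(pairs, threshold=90.0):
    """Keep only labels at or above the given confidence (percent)."""
    return [label for label, conf in pairs if conf >= threshold]

print(confident_tags(tags))  # ['Person', 'Restaurant', 'Dining Table', 'Meal']
```

Lowering the threshold admits weaker guesses (for example, "Coffee Cup" at 74.8), which is why the less confident tags toward the bottom of the list are noisier.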

Imagga
created on 2022-03-05

man 42.3
person 40.5
office 40.3
male 34
people 34
business 31.6
adult 30.1
sitting 30.1
computer 28.6
businessman 28.2
laptop 27
executive 26
smiling 25.3
professional 24.4
desk 24
table 23.7
working 23
indoors 22.8
work 22.8
senior 22.5
mature 22.3
happy 21.3
worker 21.1
couple 20
businesswoman 20
men 18.9
meeting 18.8
suit 18.1
talking 18.1
corporate 18
smile 17.8
job 17.7
home 17.5
cheerful 17.1
team 17
casual 16.9
together 16.6
colleagues 16.5
businesspeople 16.1
group 16.1
successful 15.6
elderly 15.3
necktie 15
older 14.6
confident 14.6
lifestyle 14.5
looking 14.4
communication 14.3
horizontal 14.2
women 13.4
handsome 13.4
teamwork 13
room 12.9
success 12.9
education 12.1
manager 12.1
teacher 11.9
indoor 11.9
old 11.8
happiness 11.8
husband 11.4
bartender 11.3
bow tie 11.3
notebook 10.8
discussion 10.7
planner 10.5
attractive 10.5
one 10.5
musical instrument 10.5
portrait 10.4
contemporary 10.3
keyboard 10.3
bright 10
to 9.7
retirement 9.6
percussion instrument 9.4
face 9.2
phone 9.2
director 9.2
alone 9.1
modern 9.1
pretty 9.1
aged 9
classroom 9
technology 8.9
60s 8.8
monitor 8.8
40s 8.8
retired 8.7
kin 8.7
partner 8.7
using 8.7
day 8.6
boss 8.6
workplace 8.6
wife 8.5
two 8.5
glasses 8.3
grandfather 8
television 7.9
document 7.9
waiter 7.9
coworkers 7.9
paper 7.8
clothing 7.8
employee 7.8
teaching 7.8
two people 7.8
four 7.7
staff 7.7
age 7.6
student 7.6
tie 7.6
career 7.6
adults 7.6
sit 7.6
company 7.4
camera 7.4
secretary 7.3
occupation 7.3
speaker 7.3
spectator 7.2
school 7.2
coat 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 98.8
curtain 90.2
man 84.7
bottle 82.5
clothing 76.4
text 75.9
tableware 68.9
table 66.7
people 56.7
wine glass 54.8
meal 18.1

Face analysis

AWS Rekognition

Age 51-59
Gender Female, 97.1%
Calm 92.2%
Confused 1.8%
Happy 1.5%
Angry 1.4%
Sad 1.3%
Surprised 0.8%
Disgusted 0.7%
Fear 0.3%

AWS Rekognition

Age 18-24
Gender Female, 97.5%
Surprised 97.8%
Happy 2%
Calm 0.1%
Angry 0.1%
Confused 0%
Disgusted 0%
Fear 0%
Sad 0%
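Each AWS Rekognition face reading above assigns a probability to every emotion. A minimal sketch (reusing the percentages listed, not a live API call) of reducing such scores to a single dominant emotion per face:

```python
# Sketch: pick the dominant emotion from Rekognition-style scores.
# The dictionaries reuse the face-analysis percentages listed above.
face_1 = {"Calm": 92.2, "Confused": 1.8, "Happy": 1.5, "Angry": 1.4,
          "Sad": 1.3, "Surprised": 0.8, "Disgusted": 0.7, "Fear": 0.3}
face_2 = {"Surprised": 97.8, "Happy": 2.0, "Calm": 0.1, "Angry": 0.1}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(face_1))  # Calm
print(dominant_emotion(face_2))  # Surprised
```

By this reduction, the first face reads as calm and the second as surprised, matching the leading scores in each block.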

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people sitting at a table in front of a curtain 95.4%
a group of people sitting at a table 95.3%
a group of people sitting around a table in front of a curtain 94.4%

Text analysis

Amazon

HS
a

Google

•.....
•.....