Human Generated Data

Title

Untitled (bride opening presents)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17333

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.5
Human 98.5
Person 98.5
Clothing 98
Apparel 98
Person 97.2
Person 95.9
Sunglasses 78.9
Accessory 78.9
Accessories 78.9
Machine 60.7
Coat 59.4
Suit 59.4
Overcoat 59.4
Female 59.3
Indoors 58
Room 58

Imagga
created on 2022-02-26

computer 48.4
laptop 47.6
office 46.1
work 41.6
business 34.6
working 32.7
people 31.2
person 31
man 28.9
professional 28
corporate 25.8
businesswoman 25.4
technology 24.5
happy 24.4
desk 23.6
adult 23.6
job 23
businessman 22.9
smiling 22.4
sitting 22.3
executive 21.5
male 20.6
workplace 18.1
keyboard 17.8
worker 17.8
home 17.5
table 17.1
businesspeople 17.1
notebook 17
attractive 16.8
engineer 16.2
coat 16.1
senior 15.9
lifestyle 15.2
communication 15.1
lab coat 14.9
monitor 14.5
success 14.5
indoors 14.1
confident 13.6
smile 13.5
elderly 13.4
manager 13
wireless 12.4
mature 12.1
teamwork 12
nurse 12
looking 12
secretary 12
education 11.3
pretty 11.2
successful 11
suit 10.8
telephone 10.7
talking 10.5
meeting 10.4
portrait 10.4
contemporary 10.3
men 10.3
glasses 10.2
modern 9.8
cheerful 9.8
typing 9.7
staff 9.6
women 9.5
adults 9.5
face 9.2
phone 9.2
house 9.2
team 9
kitchen 8.9
patient 8.8
clothing 8.7
student 8.6
happiness 8.6
mobile 8.5
two 8.5
hand 8.4
inside 8.3
occupation 8.2
alone 8.2
surgeon 8.1
lady 8.1
interior 8
specialist 7.9
together 7.9
support 7.8
half length 7.8
consultant 7.8
employee 7.7
old 7.7
health 7.6
casual 7.6
age 7.6
pen 7.5
doctor 7.5
one 7.5
company 7.4
center 7.4
group 7.3
active 7.2
handsome 7.1
medical 7.1
day 7.1
paper 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96
person 93.1
sketch 88.1
cartoon 86.3
drawing 86.2
woman 78.7
black and white 74
clothing 71

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 99.8%
Sad 0.1%
Confused 0%
Disgusted 0%
Fear 0%
Happy 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 27-37
Gender Male, 96.7%
Sad 80.7%
Calm 7.8%
Surprised 3.1%
Angry 2.7%
Confused 1.9%
Happy 1.9%
Disgusted 1.3%
Fear 0.6%

AWS Rekognition

Age 31-41
Gender Male, 94.3%
Calm 99.7%
Sad 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Female, 57%
Calm 96.6%
Sad 1.8%
Confused 0.9%
Happy 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Sunglasses 78.9%

Captions

Microsoft

a group of people standing around a table 77.4%
a group of people standing in front of a box 75.8%
a group of people around each other 74.4%

Text analysis

Amazon

RUNNING
PLAIN
REE RUNNING
REE
FRAGIT
ORTON'S
5
MuNA
MuNA VE3203 ОСЛИА
ОСЛИА
VE3203

Google

ORTON'S EE RUNNING FRAGIL
FRAGIL
ORTON'S
EE
RUNNING