Human Generated Data

Title

Untitled (four people at ball)

Date

1965

People

Artist: Robert Burian, American, active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19280

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.6
Human 98.6
Apparel 98.4
Clothing 98.4
Person 97.9
Person 97.8
Person 94.7
Sleeve 86.8
Home Decor 83.4
Coat 77.5
Overcoat 75.4
Long Sleeve 70.4
Suit 65.2
Flooring 63.4
Linen 58.7
Furniture 58
Chair 58
Gown 57.1
Fashion 57.1
Robe 57.1
Evening Dress 57.1
Priest 56.8
Female 56.7
Shirt 56.5
Floor 55.7
Crowd 55.1
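Label lists like the one above are typically the output of an image-labeling API such as AWS Rekognition's DetectLabels, flattened into "Name confidence" rows. A minimal sketch of that flattening step follows; the parsing helper and the sample response fragment are illustrative assumptions, not this record's actual API payload (a real call would go through `boto3.client("rekognition").detect_labels(...)`).

```python
# Sketch: turning a DetectLabels-style response into the
# "Name confidence" lines shown above. The sample dict below is
# illustrative, not the actual API output for this photograph.

def format_labels(response, min_confidence=55.0):
    """Flatten a label-detection response into 'Name score' lines,
    sorted by descending confidence; drop low-confidence labels."""
    rows = [
        (label["Name"], label["Confidence"])
        for label in response.get("Labels", [])
        if label["Confidence"] >= min_confidence
    ]
    rows.sort(key=lambda r: -r[1])
    return [f"{name} {round(conf, 1)}" for name, conf in rows]

# Illustrative response fragment (hypothetical values)
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.6},
        {"Name": "Apparel", "Confidence": 98.4},
        {"Name": "Crowd", "Confidence": 55.1},
        {"Name": "Tree", "Confidence": 12.0},  # removed by the threshold
    ]
}

print(format_labels(sample))
```

The 55-point cutoff here is only a guess at why the list above stops near that score; the service itself returns labels down to whatever `MinConfidence` the caller requests.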

Imagga
created on 2022-03-05

man 37
corporate 32.7
business 31.6
male 31.3
professional 31.1
groom 30.7
businessman 30
people 29
adult 27.9
happy 26.9
executive 25
teacher 23.9
team 23.3
group 22.6
suit 22.2
kin 21.9
meeting 21.7
men 21.5
office 21.4
building 21.4
person 20.7
teamwork 19.5
businesswoman 18.2
educator 18
couple 17.4
women 17.4
two 16.9
attractive 16.8
smiling 16.6
smile 15.7
success 15.3
career 14.2
job 14.2
happiness 14.1
worker 13.7
black 13.5
partnership 13.4
work 13.3
day 13.3
manager 13
successful 12.8
corporation 12.5
diversity 12.5
boss 12.4
ethnic 12.4
businesspeople 12.3
lifestyle 12.3
bride 12.3
pretty 11.9
love 11.8
handshake 11.7
portrait 11.6
adults 11.4
standing 11.3
company 11.2
wedding 11
communication 10.9
colleagues 10.7
family 10.7
full length 10.7
handsome 9.8
life 9.8
partner 9.7
talking 9.5
walking 9.5
youth 9.4
employee 9.2
briefcase 9.1
modern 9.1
fashion 9
dress 9
outdoors 9
clothing 8.9
diverse 8.8
hands 8.7
leadership 8.6
architecture 8.6
sitting 8.6
new 8.1
interior 8
indoors 7.9
shake 7.8
40s 7.8
discussion 7.8
deal 7.8
outside 7.7
commerce 7.5
church 7.4
cheerful 7.3
lady 7.3
indoor 7.3
looking 7.2
bright 7.1
working 7.1
together 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

standing 97.2
clothing 97
person 96.5
dress 96
woman 94.7
man 91.7
text 87.3
suit 81.2
posing 68
smile 65.5
gallery 59

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 61.2%
Sad 28.5%
Angry 3.8%
Happy 2.9%
Confused 1.6%
Disgusted 0.9%
Surprised 0.8%
Fear 0.4%

AWS Rekognition

Age 38-46
Gender Male, 99.3%
Surprised 23.2%
Calm 22.4%
Angry 19.7%
Sad 15.7%
Happy 7.6%
Fear 4.3%
Disgusted 4.1%
Confused 2.9%

AWS Rekognition

Age 35-43
Gender Female, 57.5%
Calm 84.8%
Surprised 11.2%
Happy 1.3%
Sad 0.9%
Fear 0.7%
Disgusted 0.6%
Confused 0.4%
Angry 0.2%

AWS Rekognition

Age 40-48
Gender Female, 70.5%
Calm 96%
Happy 2.2%
Sad 0.7%
Surprised 0.4%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
Angry 0.1%
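Each AWS Rekognition block above (age range, gender with confidence, emotions ranked by score) matches the shape of one FaceDetail from a DetectFaces call with `Attributes=["ALL"]`. A sketch of how such a block could be rendered follows; the helper and the sample detail dict are illustrative, not this record's actual API output.

```python
# Sketch: rendering one FaceDetail-style dict as the lines used in
# the record above. Sample values are hypothetical.

def summarize_face(detail):
    """Return lines in the record's format: age range, gender with
    confidence, then emotions sorted by descending confidence."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, "
        f"{round(detail['Gender']['Confidence'], 1)}%",
    ]
    emotions = sorted(detail["Emotions"], key=lambda e: -e["Confidence"])
    lines += [f"{e['Type'].capitalize()} {round(e['Confidence'], 1)}%"
              for e in emotions]
    return lines

# Illustrative FaceDetail fragment (only two emotions shown)
sample_detail = {
    "AgeRange": {"Low": 33, "High": 41},
    "Gender": {"Value": "Male", "Confidence": 99.9},
    "Emotions": [
        {"Type": "CALM", "Confidence": 61.2},
        {"Type": "SAD", "Confidence": 28.5},
    ],
}

print(summarize_face(sample_detail))
```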

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%

Captions

Microsoft

a man and a woman standing in front of a window 89.9%
a group of people standing in front of a window 89.8%
a group of people standing in front of a window posing for the camera 89.7%
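Captions like the three above, each with a confidence percentage, match the shape of an Azure Computer Vision "describe image" result, where each caption carries a 0-1 confidence score. A sketch of ranking such captions follows; the helper and sample payload are illustrative assumptions, not this record's actual API response.

```python
# Sketch: reading ranked captions from a describe-image-style result.
# The sample payload is hypothetical, not this record's API output.

def ranked_captions(result):
    """Return (text, confidence-as-percent) pairs, best first."""
    caps = [(c["text"], round(c["confidence"] * 100, 1))
            for c in result.get("captions", [])]
    caps.sort(key=lambda c: -c[1])
    return caps

# Illustrative payload with scores close together, as in the record
sample = {
    "captions": [
        {"text": "a group of people standing in front of a window",
         "confidence": 0.898},
        {"text": "a man and a woman standing in front of a window",
         "confidence": 0.899},
    ]
}

print(ranked_captions(sample))
```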

Text analysis

Amazon

12
H
EXIT
XAOOX
Minح

Google

T33
A2
12
12 EXIT MJI3 Y T33 A2
EXIT
Y
MJI3
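Word lists like the Amazon one above correspond to the WORD-level detections in an AWS Rekognition DetectText response, while Google's list mixes line- and word-level results. A sketch of pulling out just the word detections follows; the helper and sample response are illustrative, not this record's actual API output.

```python
# Sketch: collecting WORD-level detections from a DetectText-style
# response. The sample dict is hypothetical, not this record's output.

def detected_words(response):
    """Return the DetectedText of each WORD-level detection,
    skipping the aggregated LINE-level entries."""
    return [d["DetectedText"]
            for d in response.get("TextDetections", [])
            if d["Type"] == "WORD"]

# Illustrative response: one line plus its constituent words
sample = {
    "TextDetections": [
        {"DetectedText": "12 EXIT", "Type": "LINE"},
        {"DetectedText": "12", "Type": "WORD"},
        {"DetectedText": "EXIT", "Type": "WORD"},
    ]
}

print(detected_words(sample))
```

Scene-text OCR on photographs like this one often yields fragments ("XAOOX", "MJI3") where signage or reversed text is only partially legible, which is why the lists above contain non-words.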