Human Generated Data

Title

Untitled (young men and women at table at ball)

Date

1962

People

Artist: Robert Burian, American active 1940s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19179

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Apparel 99.9
Clothing 99.9
Furniture 99.3
Person 99
Human 99
Person 99
Person 98.9
Person 98.9
Person 98.7
Person 98
Person 97.8
Chair 97
Fashion 95.9
Robe 95.9
Gown 94.3
Wedding 94.1
Person 92.7
Chair 92.6
Person 91.7
Bride 84.1
Wedding Gown 84.1
Indoors 83.9
Bridegroom 78.1
People 76.2
Table 75.7
Suit 74.7
Overcoat 74.7
Coat 74.7
Room 73.3
Female 71.8
Person 71.4
Helmet 67.2
Sunglasses 66
Accessories 66
Accessory 66
Dining Table 65
Photography 64.8
Photo 64.8
Chair 62.8
Portrait 62.1
Face 62.1
Dress 59.9
Evening Dress 58.6
Crowd 58.6
Sunglasses 57.3
Woman 56.9
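
The Amazon list above repeats labels such as "Person" and "Chair", one entry per detected instance, each with its own confidence score. A minimal sketch (an assumption about how one might post-process this list, not the site's actual pipeline) of collapsing it to one entry per label, keeping the highest confidence:

```python
# A few label/confidence pairs copied from the Amazon tag list above.
raw_labels = [
    ("Apparel", 99.9), ("Clothing", 99.9), ("Person", 99.0),
    ("Person", 98.9), ("Chair", 97.0), ("Person", 92.7),
    ("Chair", 92.6), ("Helmet", 67.2),
]

def collapse(labels):
    # Keep the maximum confidence seen for each label name.
    best = {}
    for name, conf in labels:
        best[name] = max(conf, best.get(name, 0.0))
    # Return entries sorted by descending confidence, matching the
    # ordering convention of the list above.
    return sorted(best.items(), key=lambda kv: -kv[1])

print(collapse(raw_labels))
```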

Imagga
created on 2022-03-05

man 40.3
person 39
male 34.7
senior 32.8
people 32.3
home 31.9
couple 30.5
indoors 28.1
adult 27
sitting 25.8
room 24.2
elderly 22
mature 21.4
smiling 21
patient 20.6
men 20.6
office 20.2
happy 20
professional 19.5
businessman 19.4
together 19.3
meeting 18.8
teacher 18.7
indoor 18.2
women 18.2
table 18.2
business 17.6
talking 17.1
group 16.9
lifestyle 16.6
retired 15.5
computer 15.2
happiness 14.9
worker 14.5
desk 14.2
occupation 13.7
colleagues 13.6
mid adult 13.5
medical 13.2
old 13.2
care 13.2
cheerful 13
educator 13
chair 12.8
70s 12.8
two people 12.6
family 12.4
case 12.4
groom 12.1
casual 11.9
communication 11.8
grandfather 11.6
hospital 11.6
executive 11.5
retirement 11.5
smile 11.4
businesspeople 11.4
inside 11
two 11
portrait 11
work 11
aged 10.9
casual clothing 10.8
sick person 10.6
life 10.5
health 10.4
nurse 10.3
teamwork 10.2
laptop 10.1
color 10
face 9.9
hairdresser 9.9
suit 9.9
team 9.9
handsome 9.8
salon 9.8
40s 9.7
interior 9.7
working 9.7
older 9.7
30s 9.6
looking 9.6
corporate 9.4
day 9.4
kin 9.4
restaurant 9.3
help 9.3
camera 9.2
modern 9.1
businesswoman 9.1
holding 9.1
pensioner 9
caring 8.8
living room 8.8
couch 8.7
love 8.7
husband 8.6
illness 8.6
enjoying 8.5
adults 8.5
barbershop 8.4
horizontal 8.4
relaxing 8.2
job 8
entrepreneur 7.9
middle aged 7.8
looking camera 7.7
lunch 7.6
doctor 7.5
friends 7.5
relaxed 7.5
enjoyment 7.5
screen 7.5
friendship 7.5
phone 7.4

Google
created on 2022-03-05

Chair 88.8
Black-and-white 84.6
Style 83.9
Monochrome photography 73.1
Event 72.7
Monochrome 72.7
Vintage clothing 71.8
Classic 67.9
Table 65.9
Room 65.9
Art 65.8
History 65
Suit 63.8
Sitting 63.4
Curtain 58
Visual arts 57.7
Team 57.4

Microsoft
created on 2022-03-05

person 97.6
text 95
clothing 85.7
chair 70
people 68.3
black and white 54.3
man 52.6
furniture 51.4

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 98%
Happy 85.1%
Calm 5.9%
Sad 5.9%
Disgusted 1%
Confused 1%
Surprised 0.6%
Angry 0.4%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99.3%
Sad 65.8%
Happy 13%
Confused 11.3%
Calm 3.7%
Surprised 2.7%
Disgusted 1.6%
Angry 1.4%
Fear 0.5%

AWS Rekognition

Age 27-37
Gender Male, 73.5%
Calm 98.6%
Happy 0.6%
Surprised 0.2%
Sad 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Female, 53.7%
Calm 40.1%
Happy 34.9%
Sad 16.3%
Angry 2.7%
Confused 2.4%
Fear 2%
Surprised 0.9%
Disgusted 0.7%

AWS Rekognition

Age 29-39
Gender Female, 55.2%
Happy 94.4%
Surprised 2.6%
Calm 1.7%
Confused 0.3%
Disgusted 0.3%
Sad 0.2%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 35-43
Gender Male, 92.4%
Calm 98.1%
Surprised 1.3%
Sad 0.2%
Disgusted 0.1%
Happy 0.1%
Angry 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 31-41
Gender Female, 71%
Calm 94.1%
Sad 3.2%
Happy 0.8%
Surprised 0.5%
Disgusted 0.4%
Confused 0.4%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 98.5%
Confused 27.9%
Calm 24%
Sad 21.3%
Happy 12.4%
Disgusted 5.4%
Angry 4.3%
Surprised 2.4%
Fear 2.2%
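
Each AWS Rekognition face record above reports a confidence for every emotion, sorted from most to least likely, so the "dominant" emotion is simply the argmax. A minimal sketch, assuming one record has been parsed into a dict (values copied from the second AWS Rekognition block above):

```python
# Hypothetical parsed form of one face record from the list above.
face = {
    "AgeRange": (30, 40),
    "Gender": ("Male", 99.3),
    "Emotions": {
        "Sad": 65.8, "Happy": 13.0, "Confused": 11.3, "Calm": 3.7,
        "Surprised": 2.7, "Disgusted": 1.6, "Angry": 1.4, "Fear": 0.5,
    },
}

def dominant_emotion(face):
    # Argmax over the emotion -> confidence map.
    return max(face["Emotions"].items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # -> ('Sad', 65.8)
```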

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Chair 97%
Helmet 67.2%
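
Low-confidence detections like "Helmet" above are plausible false positives for a photograph of a formal ball. A common practice (an assumption here, not something the source describes) is to filter features by a minimum confidence before display:

```python
# Feature scores copied from the list above.
features = {"Person": 99.0, "Chair": 97.0, "Helmet": 67.2}

def filter_features(scores, min_conf=90.0):
    # Keep only detections at or above the confidence cutoff.
    return {k: v for k, v in scores.items() if v >= min_conf}

print(filter_features(features))  # -> {'Person': 99.0, 'Chair': 97.0}
```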

Captions

Microsoft

a group of people sitting in chairs 96.4%
a group of people sitting in a chair 95.3%
a group of people sitting next to a window 91.6%

Text analysis

Amazon

a
g

Google

D.
D.