Human Generated Data

Title

Untitled (small group standing around birthday cake)

Date

1950

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2046

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.4
Person 99.4
Person 99.1
Person 99.1
Person 99
Person 99
Person 98.8
Person 98.8
Clothing 98.1
Apparel 98.1
Person 96.5
Furniture 92.6
Tie 89.3
Accessories 89.3
Accessory 89.3
Table 85.8
Chair 81.5
People 81.2
Dress 77.4
Dining Table 77.2
Indoors 71.4
Tablecloth 69.7
Plant 62.8
Suit 61.7
Coat 61.7
Overcoat 61.7
Photo 60.9
Photography 60.9
Flower 60.6
Blossom 60.6
Home Decor 60.3
Room 58.2
Footwear 56.4
Shoe 56.4
Person 56.1
Sideboard 55.1
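The tag lists above pair each label with a confidence score (0–100). As an illustrative sketch only, not the museum's actual pipeline, label lists like the Amazon one are commonly filtered by a minimum-confidence threshold before display; the data below is excerpted directly from this record.

```python
# Label/confidence pairs excerpted from the Amazon tag list in this record.
labels = [
    ("Person", 99.4), ("Clothing", 98.1), ("Furniture", 92.6),
    ("Tie", 89.3), ("Table", 85.8), ("Chair", 81.5),
    ("Dress", 77.4), ("Indoors", 71.4), ("Plant", 62.8),
    ("Shoe", 56.4), ("Sideboard", 55.1),
]

def filter_labels(labels, min_confidence=70.0):
    """Keep only labels at or above the confidence threshold."""
    return [(name, score) for name, score in labels if score >= min_confidence]

high_confidence = filter_labels(labels, min_confidence=90.0)
print(high_confidence)  # [('Person', 99.4), ('Clothing', 98.1), ('Furniture', 92.6)]
```

Raising the threshold (e.g. to 90) trims speculative guesses like "Sideboard 55.1" while keeping near-certain labels like "Person 99.4".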

Imagga
created on 2021-12-14

teacher 44.3
professional 42.2
person 40
man 38.3
people 37.9
male 35.5
businessman 34.4
business 32.2
adult 31
meeting 30.2
group 29.8
educator 28.6
office 28.2
happy 28.2
team 26.9
executive 26.7
corporate 26.6
women 26.1
men 25.8
businesswoman 25.5
sitting 23.2
teamwork 23.2
job 22.1
room 21.9
smiling 21.7
table 20.8
work 20.4
worker 19.6
together 19.3
portrait 18.8
communication 18.5
businesspeople 18
couple 17.4
boss 17.2
outfit 17
successful 16.5
success 16.1
indoors 15.8
modern 15.4
smile 15
manager 14.9
performer 14.7
dancer 14.7
suit 14.4
desk 14.2
classroom 13.9
conference 13.7
home 13.6
lifestyle 13
marimba 12.9
holding 12.4
standing 12.2
mature 12.1
musical instrument 12
indoor 11.9
two 11.9
percussion instrument 11.7
colleagues 11.7
interior 11.5
working 11.5
employee 11.5
talking 11.4
presentation 11.2
student 11.1
entertainer 11
chair 10.4
education 10.4
life 10.4
company 10.2
confident 10
pretty 9.8
attractive 9.8
cheerful 9.8
40s 9.7
diversity 9.6
partnership 9.6
ethnic 9.5
career 9.5
plan 9.5
happiness 9.4
casual 9.3
laptop 9.1
board 9
family 8.9
coworkers 8.8
looking 8.8
partner 8.7
class 8.7
leader 8.7
hall 8.7
staff 8.6
finance 8.5
ideas 8.4
training 8.3
phone 8.3
groom 8.3
friendly 8.2
new 8.1
waiter 7.9
diverse 7.8
casual clothing 7.8
hands 7.8
teaching 7.8
nurse 7.8
formal 7.6
enjoying 7.6
smart 7.5
showing 7.5
document 7.4
hospital 7.3
lady 7.3
color 7.2
handsome 7.1
day 7.1
paper 7.1

Google
created on 2021-12-14

Table 94.3
Furniture 94.1
Chair 84
Suit 79.5
Desk 74
Monochrome 73.1
Art 72.6
Monochrome photography 72.3
Event 72
Vintage clothing 69.8
Room 69
Classic 66.9
Window 62.9
History 62.7
Font 62.4
Stock photography 62.1
Rectangle 61.7
Sitting 58.4
Team 56.6
Visual arts 54.2

Microsoft
created on 2021-12-14

clothing 96.7
person 96.6
woman 95.6
dress 93.2
text 89.9
wedding dress 88.5
smile 87.6
posing 86.7
standing 84.2
bride 80.1
people 57.7
man 50.4
clothes 16.8

Face analysis

AWS Rekognition

Age 19-31
Gender Female, 88.3%
Sad 60.4%
Calm 25.3%
Confused 8.4%
Happy 2.3%
Surprised 1.6%
Angry 1.2%
Disgusted 0.4%
Fear 0.3%

AWS Rekognition

Age 20-32
Gender Male, 96.1%
Calm 94.7%
Sad 2.8%
Surprised 0.7%
Confused 0.7%
Happy 0.4%
Angry 0.4%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Male, 69.9%
Sad 89.9%
Calm 7.8%
Happy 1.5%
Confused 0.5%
Angry 0.1%
Fear 0.1%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 18-30
Gender Female, 76.2%
Happy 59.9%
Calm 22.1%
Sad 10.4%
Surprised 3.1%
Fear 2%
Confused 1.3%
Angry 0.9%
Disgusted 0.3%

AWS Rekognition

Age 45-63
Gender Male, 96.9%
Calm 79.3%
Sad 12.4%
Happy 4.7%
Surprised 1.6%
Confused 1.4%
Angry 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 31-47
Gender Female, 53.1%
Sad 65.4%
Calm 27%
Happy 3%
Confused 1.2%
Surprised 1.1%
Disgusted 0.9%
Fear 0.8%
Angry 0.6%
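Each AWS Rekognition readout above assigns every face a score across eight emotions that sums to roughly 100%. Purely as an illustration (not Rekognition's internals), the headline mood for a face is simply its highest-scoring emotion; the scores below are copied from the first face in this record.

```python
# Emotion scores (percent) for the first face analyzed in this record.
face = {
    "Sad": 60.4, "Calm": 25.3, "Confused": 8.4, "Happy": 2.3,
    "Surprised": 1.6, "Angry": 1.2, "Disgusted": 0.4, "Fear": 0.3,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda item: item[1])

print(dominant_emotion(face))  # ('Sad', 60.4)
```

Note how a "dominant" emotion can still be weakly held: the third face reads Sad at 89.9%, while the first is Sad at only 60.4% with Calm close behind at 25.3%.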

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Tie 89.3%
Shoe 56.4%

Captions

Microsoft

a group of people posing for a photo 95.7%
a group of people standing in front of a window posing for the camera 89.7%
a person standing in front of a group of people posing for a photo 89%

Text analysis

Amazon

KODVK-SVEELA