Human Generated Data

Title

Untitled (man and two women standing at party)

Date

1952

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20120

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Tie 99.5
Accessories 99.5
Accessory 99.5
Person 99.4
Human 99.4
Person 99
Dress 98.4
Person 97.9
Shorts 97
Female 96.8
Chair 94.9
Furniture 94.9
Person 94
Sleeve 93.9
Woman 89.8
Home Decor 86.6
Overcoat 85.3
Coat 85.3
Suit 84.3
Long Sleeve 81.4
Shirt 74
Skirt 71.3
Girl 62.6
Photography 59.8
Photo 59.8
Fashion 55.7
Linen 55.3
Evening Dress 55.1
Gown 55.1
Robe 55.1
Hair 55

Clarifai
created on 2023-10-22

people 99.9
group 99.2
adult 97.9
man 97.6
woman 96.9
portrait 94.8
group together 94.2
facial expression 93.8
wear 93.2
three 91.4
monochrome 90.7
music 90.5
two 89.7
five 88.3
medical practitioner 88
actor 86.7
wedding 85.3
several 85.1
four 85.1
outerwear 84

Imagga
created on 2022-03-05

man 34.9
male 34.8
person 33.6
people 31.8
adult 24.6
businessman 22.1
group 21.8
men 21.5
business 21.2
professional 17.8
women 17.4
couple 15.7
black 15.6
suit 13.9
life 13.9
happy 13.8
corporate 13.7
job 13.3
silhouette 13.2
happiness 12.5
family 12.4
lifestyle 12.3
standing 12.2
portrait 11.6
clothing 11.5
teacher 11.5
fashion 11.3
looking 11.2
team 10.7
together 10.5
sky 10.2
teamwork 10.2
human 9.7
crowd 9.6
career 9.5
work 9.4
smiling 9.4
beach 9.3
holding 9.1
dress 9
handsome 8.9
boy 8.7
guy 8.7
nurse 8.7
love 8.7
sport 8.6
tie 8.5
casual 8.5
friendship 8.4
attractive 8.4
executive 8.3
girls 8.2
businesswoman 8.2
golfer 8.1
success 8
sea 7.9
day 7.8
worker 7.8
educator 7.8
husband 7.8
groom 7.7
modern 7.7
world 7.7
player 7.7
bride 7.7
youth 7.7
wife 7.6
walking 7.6
power 7.6
meeting 7.5
manager 7.4
style 7.4
light 7.3
occupation 7.3
home 7.2
shadow 7.2
patient 7.1
summer 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.2
person 94.3
man 92.7
clothing 87.4
black and white 84
smile 75
white 72.5
drawing 68.8
posing 65.4
woman 51.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 39-47
Gender Male, 85.1%
Happy 76.2%
Surprised 10.1%
Calm 6.8%
Confused 3.8%
Disgusted 1.2%
Sad 0.7%
Angry 0.7%
Fear 0.5%

AWS Rekognition

Age 41-49
Gender Male, 98.9%
Happy 83.7%
Confused 4.8%
Calm 3.6%
Surprised 2.9%
Sad 2.7%
Disgusted 1.1%
Fear 0.9%
Angry 0.4%

AWS Rekognition

Age 53-61
Gender Male, 99.9%
Calm 97.9%
Sad 1%
Happy 0.5%
Confused 0.4%
Disgusted 0.1%
Angry 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 82.4%
Calm 95.5%
Sad 2.4%
Angry 0.6%
Confused 0.4%
Disgusted 0.4%
Surprised 0.3%
Happy 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Tie
Person
Chair
Suit
Tie 99.5%
Person 99.4%
Person 99%
Person 97.9%
Person 94%
Chair 94.9%
Suit 84.3%

Categories

Text analysis

Amazon

E2D

Google

YT37A°2-XA