Human Generated Data

Title

Untitled (nine women standing in a line)

Date

c. 1950

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19587

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-03-05

Clothing 100
Apparel 100
Dress 99.9
Person 99.8
Human 99.8
Person 99.7
Person 99.7
Person 99.4
Person 99.2
Person 98.8
Person 98.4
Female 98.1
Shoe 96.1
Footwear 96.1
Person 95.6
Woman 91.8
Suit 90.3
Overcoat 90.3
Coat 90.3
Robe 87.1
Fashion 87.1
Gown 82.4
Evening Dress 81.6
Skirt 81.4
People 76.6
Shorts 71.2
Wedding 71.1
Floor 66.8
Girl 64.9
Portrait 63.9
Photography 63.9
Face 63.9
Photo 63.9
Shoe 62.4
Wedding Gown 61
Bridesmaid 59.7
Person 56.6
Tuxedo 55.9
Shoe 53.3

Clarifai
created on 2023-10-22

people 99.9
group 99.8
group together 99
many 97.8
adult 96.4
dancing 95.9
music 95.8
woman 95.8
several 95.1
actress 92.9
dancer 92.6
singer 91.9
wear 91.9
man 91.4
leader 91.3
theater 88.1
child 88
administration 87
actor 86.8
musician 86.7

Imagga
created on 2022-03-05

teacher 44.1
people 37.4
professional 35.5
person 34.8
educator 33.1
man 29.6
adult 25.7
male 23.4
men 23.2
businessman 22.9
nurse 22.6
business 21.2
women 19.8
team 19.7
couple 19.2
happy 18.8
group 18.5
two 16.1
job 15.9
together 15.8
teamwork 14.8
dancer 14.7
corporate 14.6
performer 14.5
smiling 14.5
portrait 14.2
businesswoman 13.6
work 13.3
happiness 13.3
crowd 12.5
smile 12.1
worker 11.4
room 11
suit 10.8
holding 10.7
bride 10.5
boss 10.5
standing 10.4
meeting 10.4
child 10.3
entertainer 10.1
life 10
success 9.7
office 9.6
black 9.6
home 9.6
executive 9.5
love 9.5
day 9.4
wedding 9.2
attractive 9.1
groom 9
boy 8.7
hands 8.7
lifestyle 8.7
uniform 8.5
active 8.5
friends 8.5
manager 8.4
old 8.4
silhouette 8.3
outdoors 8.2
dress 8.1
student 7.9
leader 7.7
pretty 7.7
bouquet 7.5
human 7.5
friendship 7.5
fun 7.5
occupation 7.3
girls 7.3
new 7.3
kin 7.2
looking 7.2
sibling 7.2
family 7.1
to 7.1
indoors 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 99.7
dress 98
clothing 94.7
standing 92.3
woman 91.1
text 85.9
group 84.2
smile 77.7
footwear 75.7
posing 73.3
people 55.9
old 41
female 30

Color analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 36-44
Gender Male, 95.5%
Calm 97.9%
Happy 1.2%
Surprised 0.4%
Sad 0.2%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0%

AWS Rekognition

Age 42-50
Gender Female, 57.2%
Happy 87.5%
Calm 7.2%
Confused 1.4%
Sad 1%
Surprised 1%
Angry 0.9%
Disgusted 0.9%
Fear 0.2%

AWS Rekognition

Age 45-51
Gender Male, 89.3%
Happy 54.5%
Calm 28.5%
Sad 6.9%
Surprised 5.2%
Disgusted 2.4%
Confused 1.1%
Fear 0.8%
Angry 0.6%

AWS Rekognition

Age 38-46
Gender Female, 51%
Happy 52%
Calm 40.1%
Surprised 5%
Fear 1%
Disgusted 0.6%
Angry 0.5%
Sad 0.5%
Confused 0.4%

AWS Rekognition

Age 45-53
Gender Female, 57.6%
Sad 49.9%
Confused 29.5%
Calm 8.2%
Happy 5.2%
Disgusted 2.3%
Surprised 1.7%
Fear 1.7%
Angry 1.4%

AWS Rekognition

Age 26-36
Gender Female, 53%
Surprised 53.1%
Happy 26.8%
Disgusted 9.2%
Calm 4.9%
Confused 2%
Angry 1.8%
Sad 1.4%
Fear 0.9%

AWS Rekognition

Age 23-33
Gender Male, 98.1%
Sad 59%
Calm 16.5%
Happy 15.5%
Surprised 3.9%
Disgusted 2.5%
Confused 1.5%
Angry 0.8%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 99.8%
Person 99.7%
Person 99.7%
Person 99.4%
Person 99.2%
Person 98.8%
Person 98.4%
Person 95.6%
Person 56.6%
Shoe 96.1%
Shoe 62.4%
Shoe 53.3%

Text analysis

Amazon

2
40
LIFW
20EE1A LIFW
20EE1A