Human Generated Data

Title

Untitled (dance class, one boy and group of girls in different costumes)

Date

1960

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18777

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 99.4
Human 99.4
Person 99.4
Person 99.3
Person 99.1
Person 98.6
Person 98.5
Person 98.4
Person 98.3
Person 98
Clothing 93
Apparel 93
Interior Design 85.2
Indoors 85.2
Room 81.7
Floor 81.7
Flooring 79.4
Chair 66.5
Furniture 66.5
Kindergarten 66.2
Shoe 64.4
Footwear 64.4
Pants 62.3
Girl 62.1
Female 62.1
People 60.1
Kid 58.9
Child 58.9
Classroom 57.9
School 57.9
Shoe 57.6
Shoe 56.1
Shoe 55.8
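
The numbers above are confidence scores (0–100) of the kind returned by Amazon Rekognition label detection. A minimal sketch of how comparable labels could be requested with boto3, assuming configured AWS credentials and a hypothetical local file name for the digitized photograph:

```python
import boto3

IMAGE_PATH = "sullivan_dance_class.jpg"  # hypothetical file name

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # roughly the lowest score shown above
    )

# Print each label with its confidence, matching the "Label 99.4" layout above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```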

Clarifai
created on 2023-10-22

people 99.9
child 99.3
group together 98.9
wear 97.9
group 97.7
boy 97.5
recreation 97
school 96.1
many 95.3
several 94.8
five 93.9
adult 93.8
education 91.4
retro 90.9
man 90.8
woman 88.3
four 85.2
portrait 84.2
elementary school 84.1
family 83.9
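
Clarifai's concept predictions are values between 0 and 1, shown above as percentages. A hedged sketch against the v2 REST API and the public general-recognition model; the API key, image URL, and model identifier are placeholders and may need adjusting to a specific account setup:

```python
import requests

API_KEY = "YOUR_CLARIFAI_KEY"          # placeholder credential
IMAGE_URL = "https://example.org/sullivan_dance_class.jpg"  # placeholder URL

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Each concept has a name and a 0–1 value; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```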

Imagga
created on 2022-02-25

people 24.5
interior 19.4
person 18.5
man 17.5
musical instrument 17.2
room 17.1
adult 17.1
sketch 16.7
home 15.9
male 14.9
indoors 14.9
drawing 14.8
dress 14.4
happy 13.8
men 12.9
women 12.6
portrait 12.3
smiling 11.6
family 10.7
shop 10.5
decoration 10.4
violin 10.4
business 10.3
indoor 10
children 10
representation 10
child 10
holding 9.9
wind instrument 9.8
fashion 9.8
style 9.6
clothing 9.6
stringed instrument 9.6
bowed stringed instrument 9.5
play 9.5
elegance 9.2
art 9.2
mother 9.2
old 9
ancient 8.6
happiness 8.6
house 8.4
group 8.1
celebration 8
holiday 7.9
table 7.9
couple 7.8
smile 7.8
outfit 7.8
black 7.8
antique 7.8
chair 7.6
fun 7.5
vintage 7.4
inside 7.4
cheerful 7.3
new 7.3
glass 7.2
color 7.2
lifestyle 7.2
activity 7.2
kitchen 7.1
life 7.1
businessman 7.1
modern 7
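
Imagga's tagging endpoint returns a similar list of English tags with confidences. A minimal sketch against the v2 tags endpoint, assuming HTTP basic auth with an Imagga key/secret pair; the credentials and image URL are placeholders:

```python
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credential
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder credential
IMAGE_URL = "https://example.org/sullivan_dance_class.jpg"  # placeholder URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth
)
resp.raise_for_status()

# Each entry carries an English tag and a 0–100 confidence.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```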

Google
created on 2022-02-25

Footwear 98.1
Dress 85.4
Curtain 77.1
Art 76.8
Vintage clothing 75.9
Event 71.4
Picture frame 70.8
Illustration 68.6
Room 68
Child 67.2
Visual arts 62.3
History 59.8
Font 59.6
Drawing 58.9
Class 54.5
Monochrome 54.2
Retro style 54.1
Uniform 52.4
Toddler 52.3
Painting 51
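
Google's labels come from Cloud Vision label detection, which scores each label between 0 and 1. A minimal sketch with the google-cloud-vision client, assuming application-default credentials and a hypothetical local file:

```python
from google.cloud import vision

IMAGE_PATH = "sullivan_dance_class.jpg"  # hypothetical file name

client = vision.ImageAnnotatorClient()
with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scale the 0–1 score to match the percentages listed above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```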

Microsoft
created on 2022-02-25

posing 98
old 96.1
person 90
text 88.1
footwear 87.1
group 85.6
clothing 85.2
white 81
drawing 67.2
child 56.4
vintage 53
team 52.8
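
Microsoft's tags correspond to the Azure Computer Vision tagging operation. A hedged sketch against the v3.2 REST endpoint; the resource endpoint, subscription key, and image URL are placeholders:

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder endpoint
KEY = "YOUR_AZURE_KEY"                                          # placeholder credential
IMAGE_URL = "https://example.org/sullivan_dance_class.jpg"      # placeholder URL

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Tags come back with 0–1 confidences; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```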

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 58.7%
Fear 51.4%
Calm 34.7%
Sad 4.6%
Angry 3%
Disgusted 2.3%
Surprised 2%
Confused 1.1%
Happy 0.9%

AWS Rekognition

Age 6-12
Gender Male, 98.2%
Angry 61.2%
Confused 16%
Happy 6.6%
Calm 4.6%
Sad 4.4%
Disgusted 3.2%
Fear 2.9%
Surprised 1.1%

AWS Rekognition

Age 6-16
Gender Female, 100%
Calm 91.6%
Confused 4.4%
Happy 1.3%
Surprised 0.9%
Sad 0.7%
Fear 0.5%
Angry 0.3%
Disgusted 0.3%

AWS Rekognition

Age 6-12
Gender Female, 100%
Happy 100%
Angry 0%
Surprised 0%
Calm 0%
Sad 0%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 6-12
Gender Female, 69.6%
Happy 97.7%
Surprised 0.5%
Sad 0.5%
Calm 0.3%
Confused 0.3%
Fear 0.3%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 6-14
Gender Male, 98.4%
Happy 99.8%
Surprised 0%
Fear 0%
Angry 0%
Calm 0%
Sad 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 2-10
Gender Female, 87.8%
Happy 99.4%
Calm 0.5%
Confused 0%
Surprised 0%
Sad 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 4-12
Gender Female, 99.7%
Calm 92.8%
Fear 3.1%
Confused 1.1%
Surprised 0.8%
Sad 0.8%
Angry 0.7%
Happy 0.6%
Disgusted 0.2%

AWS Rekognition

Age 2-8
Gender Female, 57.4%
Happy 77.6%
Angry 9.6%
Confused 4.3%
Calm 3.5%
Surprised 2.6%
Sad 1%
Disgusted 0.8%
Fear 0.7%
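
Each AWS Rekognition block above is one entry from the FaceDetails list returned by face detection with full attributes (age range, gender, and emotion estimates). A minimal sketch with boto3, using the same hypothetical file name as earlier:

```python
import boto3

IMAGE_PATH = "sullivan_dance_class.jpg"  # hypothetical file name

rekognition = boto3.client("rekognition")
with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotion estimates
    )

# One block per detected face, ordered like the entries above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```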

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
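
The Google Vision rows report likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch of face detection with the google-cloud-vision client, assuming a recent client version and the same hypothetical local file:

```python
from google.cloud import vision

IMAGE_PATH = "sullivan_dance_class.jpg"  # hypothetical file name

client = vision.ImageAnnotatorClient()
with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes the same likelihood fields listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```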

Feature analysis

Amazon

Person 99.4%
Person 99.4%
Person 99.3%
Person 99.1%
Person 98.6%
Person 98.5%
Person 98.4%
Person 98.3%
Person 98%
Shoe 64.4%
Shoe 57.6%
Shoe 56.1%
Shoe 55.8%
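
The per-instance rows above (multiple Person and Shoe entries) match the Instances field that Rekognition label detection attaches to object labels, one bounding box and confidence per detected instance. A minimal sketch, reusing the hypothetical file name from earlier:

```python
import boto3

IMAGE_PATH = "sullivan_dance_class.jpg"  # hypothetical file name

rekognition = boto3.client("rekognition")
with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

# Labels such as Person and Shoe carry per-instance bounding boxes and confidences,
# which is where the individual "Person 99.4%" rows above come from.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        print(f"{label['Name']} {instance['Confidence']:.1f}%")
```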

Text analysis

Amazon

8
X

Google

8
8
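
Both text-analysis results are single detected characters. A minimal sketch of how comparable OCR output could be requested from both services, with the same hypothetical file name as above:

```python
import boto3
from google.cloud import vision

IMAGE_PATH = "sullivan_dance_class.jpg"  # hypothetical file name
with open(IMAGE_PATH, "rb") as f:
    content = f.read()

# Amazon Rekognition text detection: word-level detections.
rekognition = boto3.client("rekognition")
for detection in rekognition.detect_text(Image={"Bytes": content})["TextDetections"]:
    if detection["Type"] == "WORD":
        print("Amazon:", detection["DetectedText"])

# Google Cloud Vision text detection: the first annotation is the full text block,
# the rest are individual words/characters.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=content))
for annotation in response.text_annotations[1:]:
    print("Google:", annotation.description)
```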