Human Generated Data

Title

Untitled (cheerleaders, Pinkerton Academy)

Date

1959

People

Artist: Francis J. Sullivan (American, 1916–1996)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18868

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.7
Human 98.7
Person 98.7
Person 98.6
Person 98.3
Person 98.1
Person 96.9
Person 95.9
Clothing 95.7
Apparel 95.7
Person 93.5
Shorts 92.3
Person 91.8
Female 89.9
Chair 82.2
Furniture 82.2
Room 80.3
Indoors 80.3
Dress 78.4
Woman 69.6
Girl 69.1
Sailor Suit 69
People 65.6
Kid 65.2
Child 65.2
Floor 63.8
School 59.3
Crowd 58.9
Skirt 58.6
Flooring 58.4

Clarifai
created on 2023-10-22

people 100
group together 99.4
group 99.2
many 98.4
child 98.1
adult 97.6
uniform 97.4
woman 97.3
wear 95.3
boy 94.8
man 94
several 93.6
education 92.9
outfit 92.3
recreation 92
administration 90
leader 89.8
military 89.5
school 88.3
elementary school 87.1

Imagga
created on 2022-03-05

people 23.4
kin 22.6
person 22.6
man 19.6
male 18.5
men 16.3
sport 15.3
athlete 15
adult 14.6
room 13
child 12.7
portrait 12.3
player 12.3
business 12.1
group 12.1
black 12
ballplayer 11
world 10.5
musical instrument 10.1
modern 9.8
family 9.8
fun 9
team 9
ball 8.9
lifestyle 8.7
day 8.6
performer 8.5
face 8.5
youth 8.5
city 8.3
fashion 8.3
street 8.3
vintage 8.3
human 8.2
girls 8.2
exercise 8.2
happy 8.1
dress 8.1
fitness 8.1
active 8.1
holiday 7.9
urban 7.9
happiness 7.8
boy 7.8
play 7.8
attractive 7.7
head 7.6
style 7.4
home 7.2

Google
created on 2022-03-05

Photograph 94.2
Dress 85.5
Black-and-white 85.4
Style 83.9
Snapshot 74.3
Chair 74.2
Art 73.2
Monochrome photography 73
Monochrome 72.9
Vintage clothing 72.7
Team 70
Event 68.3
Dance 67.5
Room 65.4
Pattern 64.2
Child 63
Stock photography 62.1
Uniform 59.7
Window 58.5
Crew 56.6

Microsoft
created on 2022-03-05

clothing 95.3
text 93.7
person 93
posing 46.2

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 95.8%
Calm 82.5%
Happy 5.3%
Sad 4.1%
Confused 2.3%
Disgusted 1.9%
Angry 1.8%
Surprised 1.3%
Fear 0.8%

AWS Rekognition

Age 26-36
Gender Male, 98.6%
Sad 74.9%
Happy 11.7%
Calm 4.2%
Confused 3.2%
Angry 2.5%
Surprised 1.5%
Disgusted 1.1%
Fear 1%

AWS Rekognition

Age 33-41
Gender Male, 98.2%
Happy 52.8%
Disgusted 27.4%
Sad 8.8%
Calm 4.8%
Confused 3.5%
Surprised 1.5%
Angry 0.7%
Fear 0.4%

AWS Rekognition

Age 43-51
Gender Female, 97.2%
Happy 97.3%
Calm 1.3%
Sad 0.4%
Surprised 0.3%
Disgusted 0.2%
Confused 0.2%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 33-41
Gender Female, 70.2%
Happy 50.7%
Calm 25.2%
Sad 12.9%
Angry 3.4%
Disgusted 2.1%
Surprised 2.1%
Confused 2.1%
Fear 1.6%

AWS Rekognition

Age 29-39
Gender Female, 90.4%
Calm 91.2%
Sad 4.1%
Happy 1.6%
Confused 1%
Angry 0.9%
Surprised 0.6%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 33-41
Gender Male, 96.7%
Calm 37.8%
Sad 26%
Fear 13.3%
Happy 7.9%
Surprised 7.1%
Angry 2.9%
Disgusted 2.6%
Confused 2.4%

AWS Rekognition

Age 22-30
Gender Male, 98%
Calm 44%
Sad 24.1%
Happy 18.2%
Confused 4.8%
Disgusted 4.1%
Fear 1.9%
Surprised 1.4%
Angry 1.4%

AWS Rekognition

Age 39-47
Gender Male, 82.2%
Happy 88.8%
Calm 7.4%
Surprised 1%
Disgusted 0.9%
Confused 0.7%
Sad 0.7%
Angry 0.4%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chair
Person 98.7%
Person 98.7%
Person 98.6%
Person 98.3%
Person 98.1%
Person 96.9%
Person 95.9%
Person 93.5%
Person 91.8%
Chair 82.2%

Categories

Imagga

people portraits 96.7%
paintings art 1.2%

Text analysis

Amazon

PA
A
P
P A
9"

Google

PA
PA