Human Generated Data

Title

Untitled (little league baseball players, with man in suit)

Date

1953

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18447

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.7
Human 99.7
Person 98.9
Person 98.6
Person 97
Person 96.5
Person 95.1
Person 90.7
Clothing 90.5
Apparel 90.5
Helmet 89.9
Face 86.8
Crowd 86.3
Person 80.3
People 80
Musician 78.4
Musical Instrument 78.4
Person 78
Helmet 74.6
Costume 69.7
Outdoors 69.1
Shorts 66.5
Music Band 59.9
Nature 59.1
Festival 58.1
Paper 55.4
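
These label-and-confidence pairs (scores are percentages) match the shape of Amazon Rekognition's label detection output. A minimal sketch of such a call, assuming configured AWS credentials and a hypothetical local file name for this photograph:

    # Sketch only: produces label/confidence output like the list above.
    # Assumes AWS credentials are configured; the file name is a placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("sullivan_little_league.jpg", "rb") as f:  # hypothetical file name
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,
        )

    for label in response["Labels"]:
        # e.g. "Person 99.7", "Helmet 89.9"
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

The repeated Person and Helmet scores under Feature analysis below correspond to the per-instance detections in each label's Instances list in the same response.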

Clarifai
created on 2023-10-22

people 99.9
many 99.1
group 98.9
group together 98.7
adult 96
wear 94.8
man 94.2
music 92.9
leader 92.9
woman 92.9
veil 91.5
dancing 90
ceremony 89.7
administration 89.5
outfit 88
wedding 87.4
several 86.9
crowd 85.9
child 84.4
recreation 84
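
A comparable request against Clarifai's v2 prediction endpoint might look roughly like the following; the API key, model ID, and image URL are placeholders, and the exact payload shape should be checked against Clarifai's current documentation.

    # Sketch only; key, model ID, and URL are placeholders.
    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"
    MODEL_ID = "general-image-recognition"  # assumed general-purpose model ID
    IMAGE_URL = "https://example.org/photo.jpg"

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # Clarifai returns values in 0-1; the list above shows percentages.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')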

Imagga
created on 2022-03-04

people 25.1
megaphone 20.7
nurse 19.3
person 18.3
man 18.1
statue 17.4
acoustic device 16.7
groom 16.2
device 14.9
bride 14.4
adult 13.5
male 13.5
couple 13.1
wedding 12.9
dress 12.6
religion 12.5
happiness 11.8
photographer 11.7
marble 11.6
art 11.4
weapon 11.3
men 11.2
portrait 11
history 10.7
happy 10.6
old 10.4
monument 10.3
love 10.3
architecture 10.2
historic 10.1
girls 10
sword 10
brass 9.8
sculpture 9.7
celebration 9.6
clothing 9.5
mask 9.2
traditional 9.1
black 9
suit 9
human 9
decoration 8.8
ancient 8.6
culture 8.5
face 8.5
historical 8.5
travel 8.4
health 8.3
church 8.3
tradition 8.3
city 8.3
tourism 8.2
new 8.1
wind instrument 8.1
detail 8
family 8
women 7.9
worker 7.7
stone 7.7
two 7.6
musical instrument 7.6
marriage 7.6
room 7.5
religious 7.5
park 7.4
life 7.4
uniform 7.3
tourist 7.2
work 7.1
day 7.1
memorial 7
professional 7
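
Imagga exposes a similar tagging endpoint authenticated with HTTP basic auth. A rough sketch, with the key, secret, and image URL as placeholders:

    # Sketch only; credentials and URL are placeholders.
    import requests

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        # e.g. "people 25.1", "megaphone 20.7"
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')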

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

person 99.1
text 95.1
clothing 93.2
black and white 92.9
man 82.1
sport 75.7
group 61.7
people 58.5
clothes 23.2
crowd 22.7
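
Tags like these would come from Azure Computer Vision. A hedged sketch against the v3.2 analyze endpoint, with the endpoint, key, and image URL as placeholders:

    # Sketch only; endpoint, key, and URL are placeholders.
    import requests

    AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
    AZURE_KEY = "YOUR_KEY"

    resp = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": "https://example.org/photo.jpg"},
    )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        # The API reports confidence in 0-1; the list above shows percentages.
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')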

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 86.3%
Calm 99.8%
Sad 0.1%
Surprised 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 23-31
Gender Female, 99.3%
Calm 95.5%
Surprised 1.9%
Sad 0.6%
Fear 0.6%
Angry 0.5%
Disgusted 0.4%
Confused 0.3%
Happy 0.2%

AWS Rekognition

Age 30-40
Gender Male, 98.9%
Calm 98.1%
Happy 1%
Surprised 0.4%
Disgusted 0.1%
Confused 0.1%
Fear 0.1%
Sad 0.1%
Angry 0.1%

AWS Rekognition

Age 2-8
Gender Female, 99.5%
Fear 82.3%
Happy 4.4%
Calm 4.3%
Sad 4.1%
Surprised 2.8%
Angry 1.3%
Disgusted 0.6%
Confused 0.2%

AWS Rekognition

Age 23-31
Gender Female, 51.9%
Angry 30.1%
Sad 29.9%
Calm 27.4%
Happy 8.8%
Fear 1.2%
Disgusted 1%
Surprised 0.8%
Confused 0.6%

AWS Rekognition

Age 19-27
Gender Male, 76.4%
Calm 52.7%
Sad 40.1%
Fear 1.9%
Happy 1.6%
Disgusted 1.4%
Confused 1.2%
Surprised 0.6%
Angry 0.5%

AWS Rekognition

Age 31-41
Gender Male, 94.4%
Calm 44.5%
Sad 30.6%
Angry 6.8%
Happy 6.1%
Disgusted 4.9%
Surprised 4%
Fear 2.5%
Confused 0.7%

AWS Rekognition

Age 34-42
Gender Male, 95.2%
Calm 51.8%
Happy 37%
Fear 2.3%
Surprised 2.2%
Angry 2.1%
Sad 1.9%
Disgusted 1.6%
Confused 1.1%
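
The age ranges, gender estimates, and emotion scores above match the shape of Amazon Rekognition's face detection output. A minimal sketch, reusing the same placeholder file name as earlier:

    # Sketch only; file name is a placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("sullivan_little_league.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            # e.g. "Calm 99.8%", "Sad 0.1%"
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')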

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
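
The repeated Surprise/Anger/Sorrow/Joy/Headwear/Blurred entries are the per-face fields of Google Cloud Vision face detection, which reports bucketed likelihoods rather than numeric scores. A sketch using the google-cloud-vision client, assuming application default credentials and a placeholder file name:

    # Sketch only; assumes Google Cloud credentials are configured.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("sullivan_little_league.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihood enum names, e.g. VERY_UNLIKELY, correspond to
        # "Very unlikely" in the listings above.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)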

Feature analysis

Amazon

Person 99.7%
Person 98.9%
Person 98.6%
Person 97%
Person 96.5%
Person 95.1%
Person 90.7%
Person 80.3%
Person 78%
Helmet 89.9%
Helmet 74.6%

Categories

Imagga

paintings art 96.4%
people portraits 3.1%

Text analysis

Amazon

SOX
eas
S
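
These fragments are OCR output; Amazon Rekognition's text detection returns them as line and word detections. A minimal sketch, again with a placeholder file name:

    # Sketch only; file name is a placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("sullivan_little_league.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            # e.g. "SOX", "eas"
            print(detection["DetectedText"])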

Google

YT37A°2- AGO eap SOX
YT37A°2-
AGO
eap
SOX
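
The Google results follow the shape of Cloud Vision text detection: the first annotation is the full detected string, and the remaining annotations are its individual tokens, which is why "YT37A°2- AGO eap SOX" is repeated piece by piece above. A sketch under the same credential and file-name assumptions as before:

    # Sketch only; assumes Google Cloud credentials and a placeholder file name.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("sullivan_little_league.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    annotations = response.text_annotations
    if annotations:
        print(annotations[0].description)  # full detected text
        for token in annotations[1:]:
            print(token.description)       # individual tokens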