Human Generated Data

Title

Untitled (Little League baseball players, man signing autographs)

Date

1953

People

Artist: Francis J. Sullivan, American 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18446

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Helmet 99.9
Clothing 99.9
Apparel 99.9
Person 99.5
Human 99.5
Person 99
Helmet 96.8
Person 96.4
Person 95.4
Person 94.9
Helmet 91.3
Person 91.1
Person 90.5
Person 90.3
Shoe 89.4
Footwear 89.4
Person 81.8
Helmet 80.3
Coat 67.7
Helmet 66.7
Crowd 62.1
People 58.4
Person 42.5
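
The Amazon tags above are label/confidence pairs of the kind AWS Rekognition returns from its DetectLabels operation. The Python sketch below shows how such a list could be generated; it is a minimal illustration, not the museum's actual pipeline, and the filename and thresholds are assumptions made for the example.

import boto3

# Sketch only: assumes boto3 is installed, AWS credentials are configured,
# and the photograph is available locally as "photo.jpg" (hypothetical name).
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40,  # low threshold so weak labels (e.g. Person 42.5) still appear
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")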

Clarifai
created on 2023-10-22

people 99.6
group together 98.4
many 98.2
adult 95.7
group 95.7
wear 95.7
man 95
woman 93.1
outfit 91.1
monochrome 90.7
crowd 89.1
retro 87.1
recreation 86.4
music 84.7
child 78.8
indoors 77.2
enjoyment 75.7
veil 75.4
dancing 74.8
uniform 73.7

Imagga
created on 2022-03-04

person 27.6
man 24.2
people 22.3
male 19.8
helmet 16.7
photographer 15.3
adult 14.5
brass 12.8
human 12.7
sport 11.7
black 11.4
work 11
protection 10.9
wind instrument 10.9
toyshop 10.9
clothing 10.5
studio 9.9
group 9.7
performance 9.6
professional 9.5
women 9.5
men 9.4
costume 9.4
shop 9.3
dance 9.2
equipment 9
mask 9
team 9
handsome 8.9
art 8.8
happy 8.8
fight 8.7
war 8.6
business 8.5
musical instrument 8.4
music 8.4
dark 8.3
fashion 8.3
weapon 8.2
armor 8.1
game 8
light 8
couple 7.8
sword 7.8
portrait 7.8
shoot 7.7
death 7.7
style 7.4
street 7.4
performer 7.2
metal 7.2
dress 7.2
religion 7.2
smile 7.1
worker 7.1
instrument 7.1
job 7.1
medical 7.1
medicine 7

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.3
clothing 97.4
person 93.6
man 85.1
black and white 82.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Male, 91.6%
Calm 62%
Sad 21.8%
Confused 7.8%
Surprised 3.3%
Disgusted 2.1%
Angry 1.5%
Fear 0.9%
Happy 0.7%

AWS Rekognition

Age 35-43
Gender Male, 97.7%
Calm 66.4%
Sad 17.7%
Happy 6.1%
Fear 2.7%
Surprised 2.3%
Confused 1.9%
Angry 1.6%
Disgusted 1.3%

AWS Rekognition

Age 45-53
Gender Female, 65.8%
Calm 91%
Sad 3.8%
Happy 2.4%
Angry 0.9%
Surprised 0.6%
Confused 0.5%
Fear 0.4%
Disgusted 0.4%

AWS Rekognition

Age 41-49
Gender Male, 77.2%
Calm 55.5%
Sad 35.2%
Happy 4.1%
Fear 1.5%
Disgusted 1.3%
Angry 1.2%
Surprised 0.9%
Confused 0.3%

AWS Rekognition

Age 23-31
Gender Male, 68.8%
Happy 66%
Calm 10.3%
Sad 8.5%
Fear 7.4%
Surprised 2.8%
Angry 2.5%
Disgusted 1.4%
Confused 1.2%

AWS Rekognition

Age 23-33
Gender Female, 86.3%
Calm 52.5%
Fear 20.6%
Happy 7.8%
Surprised 6.4%
Angry 6%
Sad 3.9%
Disgusted 2%
Confused 0.8%
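
Each block above corresponds to one face returned by AWS Rekognition's DetectFaces operation, which reports an estimated age range, a gender guess with confidence, and a ranked set of emotion scores. The following sketch shows how such output could be printed; filename and setup are illustrative assumptions.

import boto3

# Sketch only: assumes configured AWS credentials and a local file "photo.jpg".
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")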

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
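
The Google Vision entries report likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch of how these per-face likelihoods could be read with the google-cloud-vision client is shown below; the filename and credentials setup are assumptions for illustration.

from google.cloud import vision

# Sketch only: assumes Google Cloud credentials are configured and the image
# is available locally as "photo.jpg" (hypothetical name).
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)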

Feature analysis

Amazon

Helmet
Person
Shoe
Helmet 99.9%
Helmet 96.8%
Helmet 91.3%
Helmet 80.3%
Helmet 66.7%
Person 99.5%
Person 99%
Person 96.4%
Person 95.4%
Person 94.9%
Person 91.1%
Person 90.5%
Person 90.3%
Person 81.8%
Person 42.5%
Shoe 89.4%

Categories

Imagga

paintings art 93.9%
people portraits 4.4%

Text analysis

Amazon

RE
G
AN
S
r
GIAN'S
NAGON
eap
YT37A°2- NAGON
PLAYED
YT37A°2-
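
The strings above are raw OCR detections (including partially reversed or garbled readings of lettering in the photograph) of the kind AWS Rekognition's DetectText operation returns. A minimal sketch, with an assumed local filename, is shown below.

import boto3

# Sketch only: assumes configured AWS credentials and a local file "photo.jpg".
client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# "LINE" detections cover whole lines of text; "WORD" detections are individual tokens.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))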

Google

eap YT3RA°2-NAGO RE CIANIS
eap
YT3RA°2-NAGO
RE
CIANIS