Human Generated Data

Title

Untitled (minstrel show performance)

Date

January 14, 1950

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18055

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.6
Human 99.6
Interior Design 99.5
Indoors 99.5
Person 99.5
Person 99.4
Person 99.4
Person 99.1
Person 98.7
Person 98.6
Person 98.6
Audience 97.6
Crowd 97.6
Person 96.9
Person 96.9
Person 96.6
Person 96
Person 95.5
Person 95.2
Stage 93.7
Room 92.9
Person 91.8
Person 91.6
Person 90.4
Person 89.6
Person 89
Person 87.4
Person 87
Speech 71.7
Performer 70.8
Theater 69.8
People 65.8
Clothing 64.3
Apparel 64.3
Musician 64
Musical Instrument 64
Suit 62.7
Coat 62.7
Overcoat 62.7
Person 50.9

Clarifai
created on 2023-10-29

people 99.9
many 99.3
group 99.3
leader 96.3
woman 95.3
adult 95
man 94
group together 93.4
administration 92.8
crowd 90.8
audience 86.3
music 85.6
ceremony 85.2
education 81.5
chair 80.3
spectator 77.7
war 75
child 74.4
meeting 73.3
funeral 73.3

Imagga
created on 2022-03-04

building 31
shop 28.1
architecture 28.1
mercantile establishment 20.4
case 19.9
barbershop 19.7
city 17.4
tourism 16.5
old 16
history 15.2
historical 15
monument 14.9
travel 14.8
historic 14.7
people 14.5
palace 14.4
crowd 14.4
counter 13.9
structure 13.8
place of business 13.5
art 13.2
sculpture 13
culture 12.8
landmark 12.6
boutique 12.4
group 12.1
statue 11.6
black 11.4
urban 11.3
ancient 11.2
house 10.9
facade 10.8
business 10.3
famous 10.2
column 9.9
temple 9.5
men 9.4
light 9.3
traditional 9.1
design 9
scene 8.6
wall 8.5
balcony 8.4
vintage 8.4
tradition 8.3
inside 8.3
silhouette 8.3
spectator 8.2
stall 7.7
roof 7.6
destination 7.5
church 7.4
window 7.3
museum 7.3
detail 7.2
man 7.2
religion 7.2
night 7.1

Google
created on 2022-03-04

Black 89.8
Standing 86.4
Coat 86.1
Style 83.8
Suit 80.4
Crowd 76.1
Font 75.7
Snapshot 74.3
Event 74.1
Stage equipment 73.1
Art 72.2
Team 69.6
Rectangle 68.6
Crew 67.9
Monochrome 67.5
Room 66
Wheel 63.9
Stock photography 63
History 62.4
Metal 61.2

Microsoft
created on 2022-03-04

person 99.1
clothing 97.4
text 93.9
man 93.4
black and white 80
group 65.4
line 22.8
crowd 0.7

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 85.5%
Angry 41.5%
Sad 17.1%
Fear 12.3%
Happy 9.7%
Disgusted 7.6%
Calm 5.2%
Surprised 3.9%
Confused 2.8%

AWS Rekognition

Age 24-34
Gender Female, 98.3%
Sad 72.3%
Calm 14.6%
Fear 3.3%
Angry 3.2%
Disgusted 2.4%
Surprised 1.6%
Confused 1.6%
Happy 1%

AWS Rekognition

Age 41-49
Gender Male, 79.6%
Calm 50.3%
Sad 17.4%
Angry 16.9%
Fear 10.5%
Disgusted 2.5%
Happy 1.7%
Confused 0.4%
Surprised 0.3%

AWS Rekognition

Age 54-62
Gender Male, 89.1%
Calm 92.3%
Sad 4.1%
Angry 1.3%
Confused 1.2%
Surprised 0.4%
Disgusted 0.3%
Fear 0.2%
Happy 0.2%

AWS Rekognition

Age 31-41
Gender Female, 63.5%
Calm 39.8%
Happy 39.1%
Sad 10%
Surprised 5.9%
Angry 2%
Confused 1.3%
Fear 0.9%
Disgusted 0.9%

AWS Rekognition

Age 29-39
Gender Male, 98.1%
Calm 86.5%
Happy 6.8%
Sad 2.4%
Confused 2.1%
Disgusted 1%
Angry 0.5%
Surprised 0.5%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 98.7%
Sad 56.4%
Calm 20.4%
Fear 10.6%
Confused 6.3%
Disgusted 2.4%
Angry 2.3%
Surprised 1.1%
Happy 0.5%

AWS Rekognition

Age 29-39
Gender Male, 84.2%
Happy 66.5%
Surprised 16.4%
Disgusted 5.1%
Angry 4.2%
Fear 3.9%
Calm 2.1%
Sad 1.2%
Confused 0.6%

AWS Rekognition

Age 20-28
Gender Female, 50.5%
Sad 77.1%
Calm 7%
Confused 5.5%
Happy 3.9%
Fear 2.6%
Angry 2.4%
Disgusted 1%
Surprised 0.5%

AWS Rekognition

Age 20-28
Gender Female, 96.2%
Calm 95.3%
Sad 2.5%
Happy 0.6%
Angry 0.5%
Fear 0.3%
Disgusted 0.3%
Confused 0.3%
Surprised 0.2%

AWS Rekognition

Age 20-28
Gender Male, 96%
Sad 49.1%
Calm 29.8%
Happy 10.4%
Confused 5%
Disgusted 1.9%
Fear 1.6%
Surprised 1.3%
Angry 0.9%

AWS Rekognition

Age 22-30
Gender Male, 93.5%
Calm 60%
Sad 21.1%
Happy 13.7%
Fear 1.7%
Angry 1.2%
Disgusted 0.9%
Confused 0.8%
Surprised 0.6%

AWS Rekognition

Age 1-7
Gender Female, 76.6%
Calm 95.4%
Sad 2.3%
Confused 1%
Happy 0.4%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 27-37
Gender Male, 76.5%
Calm 67.2%
Happy 19.3%
Angry 3.6%
Disgusted 2.7%
Sad 2.6%
Surprised 2.5%
Confused 1.8%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 76%
Sad 66.2%
Angry 16.6%
Confused 9.4%
Calm 5%
Disgusted 1.3%
Surprised 0.5%
Fear 0.5%
Happy 0.5%

AWS Rekognition

Age 21-29
Gender Male, 85.1%
Calm 89.8%
Sad 5.2%
Angry 1.6%
Happy 1.4%
Fear 0.7%
Confused 0.6%
Disgusted 0.4%
Surprised 0.3%

AWS Rekognition

Age 13-21
Gender Female, 50.2%
Sad 40.9%
Calm 29.2%
Confused 11.3%
Happy 7.9%
Angry 4%
Disgusted 3.4%
Surprised 2%
Fear 1.4%

AWS Rekognition

Age 21-29
Gender Female, 99.2%
Calm 96.1%
Sad 2.1%
Happy 0.7%
Angry 0.5%
Confused 0.2%
Surprised 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-33
Gender Male, 96.4%
Calm 85.5%
Sad 8%
Happy 2.6%
Confused 1.2%
Fear 1.1%
Disgusted 0.7%
Angry 0.5%
Surprised 0.4%

AWS Rekognition

Age 45-51
Gender Male, 80%
Calm 44.1%
Confused 18.2%
Sad 15.9%
Disgusted 7.9%
Happy 5.5%
Surprised 4.1%
Fear 3.3%
Angry 1%

AWS Rekognition

Age 19-27
Gender Male, 72.1%
Calm 97.6%
Happy 0.8%
Sad 0.8%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person
Person 99.6%
Person 99.5%
Person 99.4%
Person 99.4%
Person 99.1%
Person 98.7%
Person 98.6%
Person 98.6%
Person 96.9%
Person 96.9%
Person 96.6%
Person 96%
Person 95.5%
Person 95.2%
Person 91.8%
Person 91.6%
Person 90.4%
Person 89.6%
Person 89%
Person 87.4%
Person 87%
Person 50.9%

Text analysis

Amazon

EE
NAGOY
YT3RA3 NAGOY
YT3RA3