Human Generated Data

Title

Untitled (two portraits of five African American children in studio)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3084

Machine Generated Data

Tags

Amazon
created on 2022-01-21

Furniture 99.1
Person 98.5
Human 98.5
Chair 98.1
Person 97.7
Person 97.6
Clothing 96.9
Apparel 96.9
Person 96.3
Person 95.4
Person 92.7
Person 83.5
Suit 80.1
Overcoat 80.1
Coat 80.1
People 77.2
Person 72.8
Shorts 72.6
Kid 67.4
Child 67.4
Baby 67
Indoors 64.8
Toy 64.2
Portrait 61.3
Face 61.3
Photography 61.3
Photo 61.3
Shoe 59.2
Footwear 59.2
Room 57.9
Couch 56.9
Dress 56.9
Person 56.8
Figurine 56.6
Sitting 55.5

Clarifai
created on 2023-10-26

people 99.8
group 99.6
child 99.3
man 98.5
education 97.5
nostalgia 96.2
monochrome 95.6
science 94.6
woman 94.1
family 94
several 93.5
outfit 93.2
adult 92.7
wear 92.6
retro 91.7
son 91.7
actor 91.4
indoors 85.8
group together 85.7
doll 85.3

Imagga
created on 2022-01-21

negative 55.1
brass 44.4
film 44.4
photographic paper 33.6
wind instrument 32.8
musical instrument 22.7
photographic equipment 22.4
people 22.3
person 16.6
man 14.3
male 14.2
kin 13.5
silhouette 13.2
human 12.7
team 12.5
men 12
design 11.2
cornet 11.2
adult 11.2
party 10.3
art 10.2
decoration 10.2
teamwork 10.2
business 9.7
group 9.7
celebration 9.6
toyshop 9.5
symbol 9.4
culture 9.4
holiday 9.3
face 9.2
traditional 9.1
holding 9.1
sport 9.1
portrait 9.1
black 9
child 8.9
bugle 8.8
figure 8.8
world 8.7
costume 8.6
friendship 8.4
shop 8.4
happy 8.1
dress 8.1
body 8
crowd 7.7
head 7.6
reflection 7.5
event 7.4
sculpture 7.3
color 7.2
lifestyle 7.2
ball 7.2
history 7.1
bright 7.1
paper 7.1
travel 7
sky 7

Google
created on 2022-01-21

Microsoft
created on 2022-01-21

person 99
clothing 88.4
text 85.8
standing 84.9
posing 81.3
group 69.4
people 63.8
statue 58
old 54.8

Face analysis

Amazon

Google

AWS Rekognition

Age 7-17
Gender Female, 98.2%
Calm 95.3%
Sad 3.7%
Surprised 0.4%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 6-16
Gender Female, 62.4%
Calm 89.1%
Sad 4.9%
Surprised 4%
Angry 0.6%
Confused 0.4%
Disgusted 0.4%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 23-31
Gender Male, 99.4%
Calm 92.4%
Sad 5.5%
Surprised 1%
Disgusted 0.4%
Confused 0.2%
Angry 0.2%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Female, 97.3%
Calm 97.1%
Surprised 1.2%
Sad 0.9%
Happy 0.2%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 2-10
Gender Male, 55.2%
Calm 64.2%
Surprised 16.8%
Angry 7.2%
Sad 5.4%
Disgusted 3%
Fear 1.5%
Happy 1%
Confused 0.9%

AWS Rekognition

Age 14-22
Gender Female, 93.9%
Calm 80.7%
Surprised 8.1%
Happy 7.3%
Disgusted 1.2%
Angry 1%
Confused 0.8%
Sad 0.6%
Fear 0.4%

AWS Rekognition

Age 20-28
Gender Female, 96.9%
Calm 94.6%
Sad 4.7%
Surprised 0.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Happy 0%
Fear 0%

AWS Rekognition

Age 23-31
Gender Male, 96.7%
Calm 70.1%
Surprised 15%
Happy 10.7%
Sad 1.9%
Fear 0.7%
Confused 0.7%
Disgusted 0.6%
Angry 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Shoe 59.2%

Categories

Imagga

paintings art 64.6%
text visuals 25.5%
people portraits 8.4%

Text analysis

Amazon

so