Human Generated Data

Title

Untitled (group of African American women holding pineapples)

Date

c. 1950

People

Artist: Harry Annas, American, 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2683

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-16

Person 99.7
Human 99.7
Person 99.2
Person 99
Person 99
Person 98.5
Person 97.7
Clothing 97.3
Apparel 97.3
Person 95.2
Person 93.1
Person 93
Dress 91.5
Shelter 88.5
Building 88.5
Countryside 88.5
Rural 88.5
Nature 88.5
Outdoors 88.5
People 87.6
Female 79.2
Face 78.1
Person 73.6
Woman 64.1
Portrait 63.6
Photography 63.6
Photo 63.6
Girl 60.2
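
The Amazon tags above are the kind of output AWS Rekognition's DetectLabels operation returns. A minimal sketch of that call with boto3 follows; the image file name is a placeholder, since this record does not include the source image path.

    import boto3

    # Placeholder file name; the record does not include the source image path.
    IMAGE_PATH = "photograph.jpg"

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=60,
        )

    # Each label carries a name and a 0-100 confidence, as in the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")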

Clarifai
created on 2023-10-26

people 99.9
group 99.7
child 97.3
dress 96.1
adult 95.6
music 94.7
wear 94.4
musician 93
group together 92.8
man 92.4
woman 92.1
many 91.4
boy 90
school 88.9
family 85.2
singer 85.1
education 83.9
teacher 83.3
wedding 82.8
actor 82.8
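
The Clarifai tags resemble output from Clarifai's v2 predict endpoint. The sketch below is illustrative only: the API key, the general image-recognition model ID, and the image URL are all assumptions, not part of this record.

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"            # placeholder
    MODEL_ID = "general-image-recognition"       # assumed general tagging model
    IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image URL

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    # Concepts are returned with a 0-1 value; scaling by 100 gives the
    # percentages listed above (people 99.9, group 99.7, ...).
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")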

Imagga
created on 2022-01-16

kin 43.8
life 19.1
old 17.4
grunge 16.2
people 16.2
grandfather 13.8
child 13.6
landscape 13.4
male 12.9
man 12.8
vintage 12.4
park 12.3
rural 12.3
building 12.1
texture 11.8
field 11.7
tree 11.5
outdoors 11.2
space 10.9
outdoor 10.7
forest 10.4
antique 10.4
grass 10.3
black 10.2
snow 10.1
structure 9.8
trees 9.8
country 9.6
sky 9.6
cold 9.5
winter 9.4
person 9.3
adult 9.2
silhouette 9.1
dirty 9
dad 8.7
scene 8.6
grungy 8.5
frame 8.3
aged 8.1
horizon 8.1
water 8
fence 7.9
couple 7.8
art 7.8
architecture 7.8
empty 7.7
blank 7.7
picket fence 7.6
house 7.5
world 7.5
father 7.4
style 7.4
musical instrument 7.4
grain 7.4
paint 7.2
scenic 7
season 7
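
The Imagga tags match the shape returned by Imagga's /v2/tags endpoint, which uses HTTP basic auth with a key/secret pair. The key, secret, and image URL in this sketch are placeholders.

    import requests

    API_KEY = "YOUR_IMAGGA_KEY"                  # placeholder
    API_SECRET = "YOUR_IMAGGA_SECRET"            # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    # Each tag has a 0-100 confidence, as in the list above.
    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")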

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

text 96.9
grass 96.4
clothing 94
outdoor 90.2
person 86
old 79.4
posing 74.9
woman 71.8
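
The Microsoft tags resemble output from Azure Computer Vision's Analyze Image operation (v3.2) with the Tags visual feature. The endpoint, key, and image URL below are placeholders for an Azure resource, not values from this record.

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder
    IMAGE_URL = "https://example.org/photo.jpg"                     # placeholder image URL

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": IMAGE_URL},
    )
    resp.raise_for_status()

    # Confidence is reported on a 0-1 scale; scaling by 100 gives values
    # such as text 96.9 and grass 96.4 above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")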

Color Analysis

Face analysis

AWS Rekognition

Age 4-12
Gender Male, 87%
Fear 38.2%
Confused 26.1%
Surprised 19.2%
Calm 11%
Sad 3.1%
Angry 1%
Disgusted 0.9%
Happy 0.4%

AWS Rekognition

Age 19-27
Gender Female, 95.7%
Calm 86.2%
Sad 5.8%
Happy 4.5%
Confused 1.4%
Fear 0.6%
Disgusted 0.6%
Angry 0.5%
Surprised 0.4%
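
The two AWS Rekognition face records above (age range, gender, and ranked emotions) are the kind of output DetectFaces returns when all attributes are requested. A minimal sketch with boto3, using a placeholder file name:

    import boto3

    # Placeholder file name; the record does not include the source image path.
    IMAGE_PATH = "photograph.jpg"

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")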

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
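
The Google Vision face records report likelihood buckets rather than percentages, which is why the values read "Very unlikely", "Possible", and so on. A minimal sketch of the face detection call with the google-cloud-vision client, assuming a placeholder file name and configured Google Cloud credentials:

    from google.cloud import vision

    IMAGE_PATH = "photograph.jpg"  # placeholder file name

    client = vision.ImageAnnotatorClient()
    with open(IMAGE_PATH, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods come back as enum buckets (VERY_UNLIKELY ... VERY_LIKELY).
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)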

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

paintings art 99.2%

Text analysis

Amazon

ОДЛ
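
The detected string above is the kind of result AWS Rekognition's DetectText returns. A minimal sketch with boto3, using a placeholder file name:

    import boto3

    # Placeholder file name; the record does not include the source image path.
    IMAGE_PATH = "photograph.jpg"

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Detections are returned as whole lines and as individual words.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              f"{detection['Confidence']:.1f}")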

Google

NAGO-YT37A2-MAMT2A3
NAGO-YT37A2-MAMT2A3
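
The repeated string above is consistent with Google Vision's text detection, which returns the full detected text first and its individual pieces after it. A minimal sketch with the google-cloud-vision client, assuming a placeholder file name and configured credentials:

    from google.cloud import vision

    IMAGE_PATH = "photograph.jpg"  # placeholder file name

    client = vision.ImageAnnotatorClient()
    with open(IMAGE_PATH, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # The first annotation is the full detected text; the rest are its
    # individual pieces, which is why the same string can repeat above.
    for annotation in response.text_annotations:
        print(annotation.description)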