Human Generated Data

Title

Untitled (children with fishing nets standing on dock, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American (1945-1984)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.187.2

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.3
Human 99.3
Person 99
Person 98.8
Clothing 98
Apparel 98
Face 92.1
Play 91.4
Chair 87.2
Furniture 87.2
Kid 79.6
Child 79.6
Dress 79.4
Female 77.4
Shorts 75.8
Person 72.3
Girl 70.6
Portrait 68.1
Photography 68.1
Photo 68.1
Clock Tower 66.6
Building 66.6
Architecture 66.6
Tower 66.6
People 63.3
Floor 62.2
Costume 60.8
Pants 58.9
Shoe 58.4
Footwear 58.4
Baby 56.4
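
The Amazon labels and confidence scores above are the sort of output AWS Rekognition's detect_labels operation returns. A minimal sketch of such a call, assuming boto3 credentials are configured and using a hypothetical local filename (this is not the museum's actual pipeline):

```python
# Sketch: generate label tags like those listed above with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition")

# Hypothetical filename standing in for the photograph's image file.
with open("gahan_dock_children.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the tag list above bottoms out near 56%
)

# Print "Label confidence" pairs in the same format as the tag list.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```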

Clarifai
created on 2023-10-22

people 99.6
child 96.5
woman 95.4
group 95.2
man 94.8
adult 94.4
street 93.9
monochrome 93.3
group together 92.2
boy 92
girl 89.6
portrait 88.2
family 87.2
art 85.5
room 82.6
wedding 82.1
son 81.2
actor 80.6
two 80.2
recreation 78.3

Imagga
created on 2021-12-14

grandfather 25
man 24.2
grandma 22.9
people 21.7
old 18.8
person 18.2
statue 17.1
male 16.3
happy 15.7
kin 15
nurse 14.6
adult 14.5
senior 13.1
portrait 12.9
home 12.8
room 12.5
hospital 12.3
couple 12.2
smile 12.1
sculpture 11.5
elderly 11.5
medical 11.5
patient 11.4
child 11.3
happiness 11
world 10.7
architecture 10.1
smiling 10.1
surgeon 9.9
art 9.8
family 9.8
surgery 9.8
health 9.7
mask 9.7
retired 9.7
illness 9.5
ancient 9.5
doctor 9.4
monument 9.3
camera 9.2
hand 9.1
history 8.9
mother 8.8
love 8.7
married 8.6
head 8.4
care 8.2
aged 8.1
religion 8.1
medicine 7.9
black 7.8
marble 7.7
men 7.7
bride 7.7
parent 7.5
house 7.5
traditional 7.5
worker 7.5
father 7.3
decoration 7.2
dress 7.2
hair 7.1
women 7.1
face 7.1
work 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 97.7
text 97.5
clothing 91
black and white 85.8
man 85.2
human face 77.3
standing 76.9
posing 60.2

Color Analysis

Face analysis

AWS Rekognition

Age 23-37
Gender Female, 94.4%
Calm 89.6%
Sad 4.7%
Happy 2.8%
Confused 1.4%
Disgusted 0.4%
Angry 0.4%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 7-17
Gender Male, 70%
Calm 85.5%
Sad 4.6%
Surprised 4%
Happy 2.1%
Angry 1.3%
Confused 1.2%
Disgusted 0.8%
Fear 0.6%

AWS Rekognition

Age 29-45
Gender Female, 57.2%
Fear 63.4%
Angry 27.7%
Sad 3.4%
Surprised 1.6%
Happy 1.3%
Confused 1%
Calm 0.9%
Disgusted 0.6%

AWS Rekognition

Age 23-37
Gender Male, 55.2%
Calm 82.5%
Happy 9.9%
Sad 3.5%
Angry 1.7%
Confused 1%
Fear 0.5%
Surprised 0.5%
Disgusted 0.3%
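
Per-face age ranges, gender estimates, and emotion scores like the four AWS Rekognition blocks above come from Rekognition's detect_faces operation with full attributes requested. A minimal sketch under the same assumptions (configured boto3 credentials, hypothetical filename):

```python
# Sketch: per-face age, gender, and emotion estimates via AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition")

with open("gahan_dock_children.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort high-to-low to match the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```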

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
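
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch with the google-cloud-vision client, again with a hypothetical filename and assuming Google Cloud credentials are set up:

```python
# Sketch: Google Cloud Vision face detection, printing likelihood buckets.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gahan_dock_children.jpg", "rb") as f:  # hypothetical filename
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihood values are enums such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```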

Feature analysis

Amazon

Person 99.3%
Clock Tower 66.6%

Categories

Imagga

people portraits 56.5%
paintings art 41.4%