Human Generated Data

Title

Untitled (four men crouching in front of six dead geese)

Date

1951

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6368

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.5
Human 99.5
Person 99.5
Person 99.5
Person 98.8
Animal 89.3
Sea Life 87.9
Bird 86
Bird 84.8
Bird 83
Mammal 81.9
Bird 79.8
People 77.7
Bird 77.2
Apparel 69.5
Clothing 69.5
Bird 65.5
Military Uniform 63.9
Military 63.9
Fish 61.5
Fish 58.8
Fish 57.2

Clarifai
created on 2023-10-26

people 99.9
group 99.4
group together 97.6
veil 95.7
adult 95.4
many 95.3
wear 93
man 92.8
child 88.7
woman 87.8
military 85.2
boy 84.3
uniform 83.5
leader 83
administration 82.9
vehicle 82.8
outfit 82.5
lid 82.3
several 80.4
recreation 79.1

Imagga
created on 2022-01-22

sword 30.2
person 27.4
weapon 25.5
dancer 19.8
people 19.5
performer 18.8
art 17.6
black 17.4
man 16.8
male 16.3
dress 16.2
adult 15.7
fashion 14.3
dance 14.3
style 13.3
costume 13
entertainer 12.6
dark 12.5
clothing 11.2
old 11.1
motion 11.1
pose 10.9
statue 10.6
portrait 10.3
singer 10.2
posing 9.8
body 9.6
sport 9.3
event 9.2
attractive 9.1
exercise 9.1
cool 8.9
ancient 8.6
wall 8.5
face 8.5
grunge 8.5
musician 8.5
clothes 8.4
human 8.2
sexy 8
lifestyle 7.9
model 7.8
horror 7.8
men 7.7
performance 7.6
power 7.5
player 7.5
group 7.2
mask 7.2
stylish 7.2
team 7.2
history 7.1
sculpture 7
modern 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.3
person 97.5
outdoor 93.5
clothing 90.8
man 75.4

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 90%
Sad 83.2%
Calm 15.2%
Confused 0.4%
Happy 0.4%
Fear 0.3%
Angry 0.3%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 45-51
Gender Male, 96.6%
Calm 99.2%
Sad 0.4%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0%
Happy 0%
Confused 0%

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Happy 42.3%
Calm 32.4%
Sad 7.3%
Angry 7.1%
Disgusted 4.3%
Surprised 2.9%
Confused 2.1%
Fear 1.6%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Calm 79.7%
Confused 7.9%
Angry 5%
Sad 4.9%
Disgusted 1.5%
Surprised 0.5%
Fear 0.3%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Bird 86%
Fish 61.5%

Categories

Imagga

paintings art 90.8%
people portraits 6.7%