Human Generated Data

Title

Untitled (men standing in remains of a building fire)

Date

1957

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18667

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.6
Human 99.6
Person 99.5
Person 99.4
Person 98.8
Nature 95.8
Outdoors 92.7
Clothing 92.3
Apparel 92.3
Soil 90.1
Ground 88.9
Shelter 87.8
Building 87.8
Rural 87.8
Countryside 87.8
People 70
Standing 67.8
Archaeology 60.8
Tree 58.3
Plant 58.3
Camping 56.2
Female 55.8

Clarifai
created on 2023-10-23

people 99.8
group together 98.8
adult 98.8
group 98.7
man 97
war 95.2
child 95.2
administration 94.4
woman 92
soldier 90.5
many 90.2
interaction 89.7
three 89.6
home 89.1
skirmish 88.4
military 87.4
several 86.1
police 85.2
monochrome 83.7
four 82

Imagga
created on 2022-03-05

cemetery 20.9
sand 18.6
sky 17.9
man 17.5
landscape 17.1
water 16.7
rock 16.5
travel 15.5
people 15.1
sunset 14.4
park 14
outdoors 13.7
sun 13.7
old 13.2
vacation 13.1
beach 12.9
earth 12.8
rustic 12.5
stone 12.5
person 12.4
danger 11.8
tree 11.8
desert 11.3
two 11
summer 10.9
vintage 10.8
tourism 10.7
male 10.7
outdoor 10.7
structure 10.7
grunge 10.2
sea 10.2
ocean 10
mountain 9.8
black 9.6
forest 9.6
light 9.6
gravestone 9.5
love 9.5
child 9.3
adult 9.1
texture 9
river 8.9
country 8.8
couple 8.7
memorial 8.7
art 8.5
silhouette 8.3
peaceful 8.2
protection 8.2
sunlight 8
antique 7.8
men 7.7
walking 7.6
kin 7.5
dry 7.4
dirty 7.2
color 7.2
wall 7.2
recreation 7.2
life 7.1
trees 7.1
world 7.1
scenic 7

Google
created on 2022-03-05

Adaptation 79.3
Tints and shades 76.7
Art 75.7
Tree 75.4
Monochrome photography 73.7
Monochrome 73.4
Vintage clothing 70.1
Grass 69.1
Visual arts 68.4
Landscape 68.2
Event 67.5
Plant 67.2
History 66.4
Font 62.8
Soil 61.5
Room 60.9
Photographic paper 56
Hat 54.3
Picture frame 52.8

Microsoft
created on 2022-03-05

outdoor 97.5
ground 97.1
person 93.9
clothing 93.1
text 91.3
man 87.8
black and white 80.4
old 72.3
black 71.8
white 66.3

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 65.1%
Calm 99.2%
Confused 0.4%
Sad 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 45-53
Gender Male, 100%
Calm 89.5%
Sad 5.9%
Happy 1.8%
Angry 0.9%
Confused 0.9%
Disgusted 0.4%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 54.8%
Calm 86.9%
Happy 8.4%
Sad 2.3%
Fear 0.9%
Confused 0.6%
Disgusted 0.5%
Angry 0.3%
Surprised 0.2%

AWS Rekognition

Age 45-51
Gender Male, 100%
Calm 99.1%
Sad 0.4%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.6%
Person 99.5%
Person 99.4%
Person 98.8%

Categories

Imagga

paintings art 100%