Human Generated Data

Title

Untitled (remains on stretcher, carried by two men)

Date

c.1970

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18762

Machine Generated Data

Tags

Amazon
created on 2022-02-25

Person 98.6
Human 98.6
Person 98.2
Nature 97.4
Person 96.9
Outdoors 96.5
Person 94.8
Transportation 84.7
Vehicle 84.1
Automobile 83.3
Countryside 74.8
Clothing 67.9
Apparel 67.9
Car 65.4
Rural 60.2
Home Decor 57.8
Wheel 57
Machine 57
Hut 56.7
Building 56.7
Shack 55.8
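Label lists like the Amazon block above are typically produced by an image-labeling API such as AWS Rekognition's DetectLabels and then flattened into "Name score" lines. The sketch below shows one plausible way to do that with boto3; the bucket and file names are placeholders, not taken from this record, and the actual API call is commented out because it requires AWS credentials.

```python
def format_labels(labels, min_confidence=55.0):
    """Flatten a DetectLabels-style response into 'Name score' lines,
    keeping only labels at or above the confidence threshold."""
    lines = []
    for label in labels:
        if label["Confidence"] >= min_confidence:
            lines.append(f"{label['Name']} {label['Confidence']:.1f}")
    return lines

# The live call would look roughly like this (requires AWS credentials;
# bucket/key are hypothetical):
# import boto3
# client = boto3.client("rekognition")
# resp = client.detect_labels(
#     Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
#     MinConfidence=55,
# )
# print("\n".join(format_labels(resp["Labels"])))

# Demonstration with made-up response data mirroring the list above:
sample = [
    {"Name": "Person", "Confidence": 98.6},
    {"Name": "Shack", "Confidence": 55.8},
    {"Name": "Bicycle", "Confidence": 12.0},  # filtered out by threshold
]
print(format_labels(sample))
```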

Clarifai
created on 2023-10-22

people 99.9
group together 98.4
man 96.4
child 96.3
adult 95.4
street 94.9
vehicle 94.4
woman 94.3
group 92.9
two 91.9
three 89.7
police 88.6
transportation system 87.9
administration 86.9
snow 86.6
storm 85.8
boy 85.8
four 82
winter 81.4
home 80.4

Imagga
created on 2022-02-25

city 20.8
man 19.5
urban 17.5
sidewalk 17.4
people 16.2
street 15.6
walking 15.2
male 14.9
industrial 14.5
adult 13
industry 12.8
travel 12.7
person 12.4
outdoor 12.2
men 12
danger 11.8
building 11.5
walk 11.4
stretcher 11.2
beach 11
litter 10.1
ocean 10
dirty 9.9
sunset 9.9
activity 9.8
vacation 9.8
outdoors 9.8
business 9.7
sky 9.6
motion 9.4
water 9.3
conveyance 9.1
old 9.1
road 9
factory 9
snow 8.8
couple 8.7
crowd 8.6
day 8.6
sea 8.6
sport 8.6
sun 8
sand 8
working 8
child 7.9
work 7.8
accident 7.8
scene 7.8
cold 7.7
winter 7.7
dark 7.5
human 7.5
silhouette 7.4
shore 7.4
speed 7.3
group 7.3
transportation 7.2
world 7.1
architecture 7.1
wall 7.1

Google
created on 2022-02-25

Microsoft
created on 2022-02-25

ground 97.8
vehicle 96.1
outdoor 96
clothing 87.9
street 86.4
man 85.4
text 84.9
person 84.5
car 82.7
land vehicle 81.7
black and white 81.6
wheel 64.7

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 100%
Calm 77.1%
Confused 13.5%
Disgusted 5.8%
Angry 1.6%
Happy 1%
Surprised 0.5%
Fear 0.2%
Sad 0.2%

AWS Rekognition

Age 45-51
Gender Female, 94.7%
Sad 60.8%
Fear 12.5%
Calm 9.6%
Disgusted 7.8%
Surprised 4.7%
Happy 1.9%
Angry 1.6%
Confused 1.1%

AWS Rekognition

Age 21-29
Gender Female, 50.7%
Calm 97.6%
Sad 1.1%
Angry 0.6%
Disgusted 0.2%
Confused 0.2%
Happy 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 98.6%
Person 98.2%
Person 96.9%
Person 94.8%
Car 65.4%

Text analysis

Amazon

72
-