Human Generated Data

Title

Untitled (remains on stretcher carried by two men)

Date

c. 1970, from 1960 negative

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18756

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2022-03-05

Person 98.9
Human 98.9
Person 97.3
Automobile 95.3
Vehicle 95.3
Transportation 95.3
Person 91.3
Car 87.7
Person 83
Sports Car 70
Nature 61.4
Car Wash 60.3
Pedestrian 59.7
Parking Lot 57
Parking 57
Sedan 56.3
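
The label/confidence pairs above are the typical output of Amazon Rekognition's label-detection call. Below is a minimal sketch of how such tags can be produced, assuming the boto3 SDK with configured credentials; the region, file name, and MinConfidence threshold are placeholder assumptions, not part of this record.

```python
import boto3

# Region and file name are placeholders, not from the record.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest tag shown above scores 56.3
    )

# Each label carries a name and a 0-100 confidence, as listed above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```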

Clarifai
created on 2023-10-22

people 100
group together 99.2
adult 98.8
group 97.6
child 96.5
man 96.1
woman 95.8
administration 95.4
two 95.3
vehicle 94.8
home 93.2
four 92.3
three 89.8
several 89.3
war 88.2
five 87
recreation 86.5
leader 83.1
wear 82.3
boy 82
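
A minimal sketch of the equivalent Clarifai request, assuming its v2 REST API and general recognition model; the API key, model id, and image URL are placeholders. Clarifai scores concepts on a 0-1 scale, so values are scaled by 100 to match the list above.

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed id of Clarifai's general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Concepts arrive with a 0-1 value; scale to 0-100 as displayed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))
```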

Imagga
created on 2022-03-05

barbershop 42.8
shop 33.6
chair 31.8
seat 26.4
mercantile establishment 25.9
stretcher 18.1
building 18
place of business 17.3
city 15.8
litter 15.4
people 15.1
street 14.7
barber chair 14.2
dark 14.2
urban 14
old 13.9
furniture 13.9
man 13.4
architecture 13.3
wall 12.6
conveyance 11.5
industrial 10.9
park 10.7
scene 10.4
person 10.3
dirty 9.9
transportation 9.9
bench 9.8
adult 9.4
road 9
room 9
establishment 8.8
empty 8.6
industry 8.5
travel 8.4
summer 8.4
window 8.2
light 8
business 7.9
male 7.8
house 7.5
outdoors 7.5
water 7.3
portrait 7.1
trees 7.1
night 7.1
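
A minimal sketch for the Imagga tags, assuming its v2 tagging endpoint with HTTP Basic auth; the credentials and image URL are placeholders. Imagga reports confidences on a 0-100 scale, as shown above.

```python
import requests

auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")  # placeholders

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)

# Each entry pairs an English tag with a 0-100 confidence.
for item in response.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```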

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 96.3
text 93.9
black and white 84.6
clothing 68.6
person 66.5
old 61.1
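
A minimal sketch for the Microsoft tags, assuming the Azure Computer Vision SDK; the endpoint and key are placeholders. The SDK returns confidences on a 0-1 scale, scaled by 100 here to match the list above.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Endpoint and key are placeholders.
client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))
```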

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 99.9%
Calm 67.8%
Surprised 10.7%
Confused 9.1%
Sad 4.9%
Disgusted 4.6%
Fear 1.5%
Angry 0.8%
Happy 0.6%

AWS Rekognition

Age 24-34
Gender Female, 85.7%
Sad 77.7%
Disgusted 11.3%
Calm 5.3%
Angry 2.1%
Happy 1.1%
Fear 1%
Confused 0.9%
Surprised 0.7%

AWS Rekognition

Age 23-31
Gender Male, 96.8%
Sad 98.5%
Calm 0.7%
Confused 0.3%
Fear 0.3%
Happy 0.1%
Surprised 0%
Angry 0%
Disgusted 0%
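
The three face records above follow the shape of Amazon Rekognition's face-detection response: an estimated age range, a gender guess with confidence, and a ranked emotion distribution. A minimal sketch, assuming boto3 with configured credentials and a placeholder file name:

```python
import boto3

client = boto3.client("rekognition")  # region/credentials assumed configured

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unordered; sort to match the ranked lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```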

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
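
The four Google Vision records above report bucketed likelihoods rather than percentages. A minimal sketch, assuming the google-cloud-vision Python client (v2+) with configured credentials and a placeholder file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enums: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```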

Feature analysis

Amazon

Person 98.9%
Person 97.3%
Person 91.3%
Person 83%
Car 87.7%

Text analysis

Amazon

72
Kentice
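
A minimal sketch of how OCR fragments like "72" and "Kentice" can be extracted, assuming Amazon Rekognition's text-detection call via boto3; the file name is a placeholder.

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; keep lines to avoid duplicates.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```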