Human Generated Data

Title

Untitled (remains on stretcher carried by two men)

Date

1960

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18757

Machine Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-03-05

Person 98.7
Human 98.7
Person 98.2
Car 95.2
Automobile 95.2
Transportation 95.2
Vehicle 95.2
Person 92.1
Clothing 86.2
Apparel 86.2
Coat 68.6
Person 65.3
Sports Car 60.7
Nature 57.8
Sedan 57.6
Bus Stop 56.4
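
The Amazon labels above, with confidence scores, are the kind of output AWS Rekognition's label-detection call returns. As a minimal, hypothetical sketch (boto3, with a placeholder filename; not necessarily the museum's actual pipeline):

import boto3

# Hypothetical sketch: send the photograph's bytes to Rekognition and
# print label/confidence pairs like those listed above.
rekognition = boto3.client("rekognition")
with open("sullivan_untitled_1960.jpg", "rb") as f:  # placeholder filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,
    )
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")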

Clarifai
created on 2023-10-22

people 100
group together 98.9
adult 98.5
man 96.2
group 95.9
child 95.9
woman 95.1
vehicle 95.1
two 95.1
administration 93.3
four 89.8
several 89.6
three 88.8
recreation 87.6
war 87.1
five 86.6
monochrome 84.9
wear 84.8
home 83.4
many 81.3

Imagga
created on 2022-03-05

chair 36.1
seat 28.1
barbershop 18.3
barber chair 17.4
building 16.4
man 16.1
people 15.1
shop 14.3
dark 14.2
urban 14
furniture 13.6
architecture 12.5
city 12.5
water 12
mercantile establishment 11.4
person 11.3
street 11
transportation 10.8
night 10.7
scene 10.4
business 10.3
industrial 10
old 9.7
room 9.7
stretcher 9.6
adult 9.5
wall 9.3
travel 9.1
device 9.1
portrait 9.1
light 8.7
litter 8.7
empty 8.6
men 8.6
industry 8.5
male 8.5
house 8.4
silhouette 8.3
outdoors 8.2
corridor 7.9
place of business 7.8
sitting 7.7
outdoor 7.6
walk 7.6
speed 7.3
window 7.3
tourist 7.2
table 7.2
modern 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 92.8
text 92.3
clothing 75.9
person 73.2
black and white 72.1

Color Analysis

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 89.3%
Sad 3.2%
Confused 2.6%
Disgusted 1.9%
Surprised 1.7%
Fear 0.6%
Angry 0.5%
Happy 0.2%

AWS Rekognition

Age 6-14
Gender Male, 89.7%
Fear 80.6%
Angry 16.1%
Calm 1.1%
Sad 1%
Happy 0.6%
Disgusted 0.3%
Surprised 0.1%
Confused 0.1%
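
The age range, gender, and emotion estimates above correspond to the per-face attributes AWS Rekognition's face-detection call can report. A minimal sketch under the same placeholder-filename assumption:

import boto3

rekognition = boto3.client("rekognition")
with open("sullivan_untitled_1960.jpg", "rb") as f:  # placeholder filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")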

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
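
The Google Vision entries above are per-face likelihood ratings (VERY_UNLIKELY through VERY_LIKELY) from its face-detection feature. A minimal sketch using the google-cloud-vision client, again with a placeholder filename:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("sullivan_untitled_1960.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)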

Feature analysis

Amazon

Person 98.7%
Person 98.2%
Person 92.1%
Person 65.3%
Car 95.2%

Text analysis

Amazon

TR
Sappa

Google

Anpon
Anpon