Human Generated Data

Title

Untitled (injured women and car accident)

Date

1957

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18687.2

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Person 99.5
Human 99.5
Person 99.3
Person 98.2
Person 96.2
Person 93.3
Car 91.2
Automobile 91.2
Vehicle 91.2
Transportation 91.2
Outdoors 90.4
Person 89.9
Person 88.5
Nature 86.3
Car 85.9
Person 84.3
Person 81.3
Person 81
Person 80.5
Poster 73.9
Advertisement 73.9
Machine 70.1
Wheel 70.1
Snow 66.2
Winter 59
Person 53.7
Car 50.4

Clarifai
created on 2021-04-03

people 99.6
street 97.3
group 96.7
snow 94.4
adult 94.4
man 93.9
group together 93.8
woman 93.4
two 92.4
portrait 88.1
child 87.5
winter 87.5
monochrome 85.7
girl 84.7
three 84.4
administration 81.6
family 80.1
landscape 79.6
many 79.4
vehicle 79.2

Imagga
created on 2021-04-03

billboard 73
signboard 59.3
structure 44.1
sky 22.3
city 14.1
outdoors 13.5
people 12.8
travel 12.7
outdoor 12.2
black 10.8
person 10.7
park 10.7
vacation 10.6
urban 10.5
fun 10.5
day 10.2
building 10.1
street 10.1
man 10.1
male 9.9
landscape 9.7
summer 9.6
outside 9.4
architecture 9.4
clouds 9.3
sport 9.2
silhouette 9.1
old 9.1
activity 9
tree 8.8
water 8.7
lifestyle 8.7
beach 8.4
resort 8.4
window 8.4
relaxation 8.4
color 8.3
television 8.2
recreation 8.1
statue 8
trees 8
lamp 7.9
boy 7.8
art 7.8
play 7.7
jumping 7.7
pole 7.7
frame 7.5
leisure 7.5
one 7.5
town 7.4
light 7.3
island 7.3
equipment 7.3
mountain 7.1
wall 7.1

Google
created on 2021-04-03

Wheel 96.8
Photograph 94.4
White 92.2
Tire 90.9
Sky 90.6
Black 90.2
Black-and-white 85.5
Gesture 85.3
Style 83.9
Car 81.7
Vehicle 81.5
Adaptation 79.4
People 77.7
Monochrome photography 77.2
Tints and shades 77.1
Snapshot 74.3
Monochrome 74
Art 73.1
Font 68.7
Travel 68.4

Microsoft
created on 2021-04-03

text 94.8
person 91.3
image 37.1
picture frame 10.1
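
Every tag section above uses the same simple line format: a label followed by a confidence score, where the label itself may contain spaces (e.g. "group together 93.8"). As a sketch of how such records could be handled, the helpers below (`parse_tags` and `filter_tags` are hypothetical names, not part of any vendor SDK) split each line on its last space and filter by a confidence threshold:

```python
def parse_tags(lines):
    """Parse 'Label 99.5' lines into (label, confidence) pairs.

    Labels may contain spaces, so split on the last whitespace only.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

def filter_tags(tags, threshold=90.0):
    """Keep only tags at or above the given confidence threshold."""
    return [(label, score) for label, score in tags if score >= threshold]

sample = [
    "Person 99.5",
    "group together 93.8",
    "Car 50.4",
]
parsed = parse_tags(sample)
# parsed == [("Person", 99.5), ("group together", 93.8), ("Car", 50.4)]
high = filter_tags(parsed)
# high == [("Person", 99.5), ("group together", 93.8)]
```

Splitting on the last space rather than the first is what makes multi-word labels like "group together" and "Monochrome photography" parse correctly.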

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 62.5%
Sad 86.1%
Calm 12.4%
Happy 0.6%
Angry 0.4%
Fear 0.3%
Confused 0.1%
Surprised 0.1%
Disgusted 0%

AWS Rekognition

Age 24-38
Gender Female, 65.9%
Sad 78.8%
Calm 17.4%
Happy 2.5%
Confused 0.6%
Angry 0.3%
Surprised 0.3%
Fear 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Car 91.2%
Poster 73.9%
Wheel 70.1%

Text analysis

Amazon

G10