Human Generated Data

Title

Untitled (students next to school buses)

Date

1959

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1599

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Clothing 99.8
Apparel 99.8
Person 99.7
Human 99.7
Person 99.6
Person 96.9
Transportation 95.7
Bus 95.7
Vehicle 95.7
Person 95.1
Person 92.5
Fashion 90.2
Robe 90.2
Gown 86
Wedding 83.7
Nature 81.5
Person 81.4
Person 81
Pedestrian 78.3
Person 77.7
Person 77.3
Person 76.3
Wedding Gown 74.7
Person 73.9
Female 73.9
Outdoors 68.7
Bridegroom 67
Person 66.3
Person 65.3
People 63
Evening Dress 62.9
Meal 61.1
Food 61.1
Crowd 59.6
Bride 59.6
Person 59.5
Officer 57.6
Military 57.6
Military Uniform 57.6
Face 57.5
Weather 57.1
Woman 56

Clarifai
created on 2023-10-15

people 99
monochrome 94.9
bride 92.9
vehicle 91
woman 90.8
adult 90.4
man 90.2
street 89.5
group 88.4
transportation system 88.1
many 87.5
wedding 85.4
group together 82.3
car 79.4
ceremony 76.9
crowd 75.1
groom 73.8
black and white 73.4
veil 67.4
chair 64.2

Imagga
created on 2021-12-14

fountain 52
groom 40.2
structure 37.9
negative 34.9
film 28
city 21.6
photographic paper 19.4
world 18
travel 17.6
park 17.3
snow 17.2
architecture 16.4
sky 15.9
building 15.2
outdoor 14.5
old 13.9
bride 13.4
water 13.3
urban 13.1
photographic equipment 13
dress 12.7
river 12.5
tourism 12.4
statue 12.3
couple 12.2
happiness 11.8
people 11.7
color 11.7
history 11.6
famous 11.2
wedding 11
landmark 10.8
person 10.7
scene 10.4
love 10.3
black 10.2
street 10.1
palace 9.6
sculpture 9.6
man 9.4
historical 9.4
two 9.3
tree 9.2
car 9.2
vacation 9
transportation 9
landscape 8.9
mask 8.6
married 8.6
adult 8.4
summer 8.4
house 8.4
dark 8.4
transport 8.2
industrial 8.2
road 8.1
light 8.1
sun 8.1
weather 7.9
stone 7.8
art 7.8
motion 7.7
culture 7.7
power 7.6
traditional 7.5
monument 7.5
column 7.5
smoke 7.4
environment 7.4
celebration 7.2
portrait 7.1
women 7.1
night 7.1
day 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98
outdoor 93.2
wedding 71.6
clothing 66.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 39-57
Gender Female, 78.6%
Calm 91.4%
Sad 7.3%
Angry 0.4%
Happy 0.3%
Confused 0.2%
Fear 0.2%
Disgusted 0.1%
Surprised 0%

AWS Rekognition

Age 8-18
Gender Female, 61.2%
Calm 77.4%
Sad 16.1%
Happy 5%
Angry 0.6%
Confused 0.4%
Fear 0.2%
Surprised 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.7%
Bus 95.7%

Text analysis

Amazon

BUS
CHARTERED
34
DOOR
EMERGENCY DOOR
EMERGENCY
Two
XPI
TRANSPORTATION CO
MUS

Google

EHERGENCY DOOR CHARTERED -YT3RA°2--XAGO>
EHERGENCY
DOOR
CHARTERED
-YT3RA°2--XAGO>