Human Generated Data

Title

Untitled (Boy Scouts on back of train engine)

Date

1955

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18588

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 98.8
Person 98.6
Person 97.4
Person 96.4
Person 94.9
Train 92.3
Transportation 92.3
Vehicle 92.3
Nature 89.5
Outdoors 86.5
Person 84.9
Home Decor 84.8
Ice 80.4
Face 80.1
Water 76.7
Crowd 74.5
People 73.5
Path 68.9
Person 64
Sea 63
Ocean 63
Pedestrian 62.7
Clothing 60.3
Apparel 60.3
Tree 60.2
Plant 60.2
Sailor Suit 59.9
Photography 57.4
Photo 57.4
Vacation 55.8
Shoreline 55.7
Person 46.4

Clarifai
created on 2023-10-22

people 99.9
group 98.9
group together 98.7
adult 97.3
vehicle 97.3
transportation system 96.9
many 96.8
man 94.9
monochrome 94.3
watercraft 92
several 89.3
train 85.5
railway 85.4
woman 83.4
wear 82.2
military 78.9
crowd 77.3
hospital 76.9
medical practitioner 76.5
administration 75.3

Imagga
created on 2022-03-05

city 26.6
architecture 25.8
building 23.8
snow 22.2
travel 21.8
urban 17.5
station 17.1
transportation 17
train 16.7
transport 16.4
wheeled vehicle 16.4
river 16
old 15.3
car 14.8
winter 14.5
bridge 14.3
sketch 13.5
water 13.3
ship 13.3
cold 12.9
sky 12.7
industrial 12.7
weather 11.8
tower 11.6
tourism 11.5
vehicle 11.5
construction 11.1
landmark 10.8
drawing 10.7
landscape 10.4
cityscape 10.4
house 9.7
industry 9.4
season 9.3
town 9.3
street 9.2
history 8.9
passenger car 8.9
vessel 8.7
port 8.7
buildings 8.5
outdoor 8.4
boat 8.4
famous 8.4
vintage 8.3
wagon 8.2
container 8
light 8
representation 8
factory 7.7
wall 7.7
equipment 7.7
power 7.6
destination 7.5
outdoors 7.5
smoke 7.4
passenger 7.4
exterior 7.4
vacation 7.4
business 7.3
road 7.2
night 7.1
stone 7.1
sea 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98
black and white 92.6
person 86.5
clothing 82.8

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 82.4%
Calm 65.5%
Happy 17%
Sad 7.8%
Confused 2.8%
Fear 2.2%
Surprised 1.8%
Angry 1.8%
Disgusted 1.1%

AWS Rekognition

Age 30-40
Gender Female, 94%
Calm 99.9%
Disgusted 0%
Fear 0%
Happy 0%
Sad 0%
Angry 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 45-53
Gender Female, 92.9%
Happy 91.4%
Calm 5.1%
Sad 1.3%
Angry 0.9%
Surprised 0.5%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 49-57
Gender Male, 94.4%
Calm 99.5%
Happy 0.2%
Surprised 0.1%
Fear 0.1%
Confused 0%
Disgusted 0%
Angry 0%
Sad 0%

AWS Rekognition

Age 27-37
Gender Female, 64.8%
Happy 97.3%
Sad 1.2%
Calm 1%
Disgusted 0.1%
Confused 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 43-51
Gender Male, 63.9%
Calm 68.5%
Sad 22.1%
Confused 3.9%
Happy 3.2%
Surprised 0.7%
Disgusted 0.7%
Angry 0.5%
Fear 0.4%

AWS Rekognition

Age 34-42
Gender Female, 87.7%
Happy 68.9%
Calm 8.6%
Confused 8%
Angry 5.6%
Sad 4.6%
Surprised 2.3%
Disgusted 1.7%
Fear 0.4%

AWS Rekognition

Age 31-41
Gender Female, 59.1%
Happy 67.1%
Calm 18.2%
Fear 4%
Sad 3.2%
Surprised 3.1%
Confused 2%
Angry 1.3%
Disgusted 1%

AWS Rekognition

Age 27-37
Gender Female, 93%
Calm 98.9%
Sad 0.4%
Surprised 0.3%
Happy 0.2%
Angry 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 60.6%
Sad 17.4%
Confused 12.9%
Happy 3.9%
Disgusted 2.7%
Surprised 1.2%
Angry 1%
Fear 0.4%

AWS Rekognition

Age 30-40
Gender Male, 96.4%
Calm 69.4%
Sad 29.4%
Confused 0.5%
Surprised 0.2%
Happy 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person
Train
Person 99.5%
Person 98.8%
Person 98.6%
Person 97.4%
Person 96.4%
Person 94.9%
Person 84.9%
Person 64%
Person 46.4%
Train 92.3%

Categories

Text analysis

Amazon

1227
BOSTON
AND
BOSTON AND MAIN
MAIN
B & M

Google

B&M 1227 1227 1227 MJI7--YT37A°2 - - XAGOX
B&M
1227
MJI7--YT37A°2
-
XAGOX