Human Generated Data

Title

Orient Point Ferry

Date

1978

People

Artist: Elaine Mayes, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1674

Copyright

© Elaine Mayes

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.8
Sitting 98.9
Furniture 97.8
Person 96.8
Person 94.7
Person 82.2
Bench 79.2
Musical Instrument 69.7
Leisure Activities 69.7
Piano 69.7
Outdoors 69.1
Vehicle 65.5
Transportation 65.5
Nature 63.5
Person 63.1
Couch 57
Ferry 56.7
Boat 56.7

Clarifai
created on 2023-10-25

people 99.9
monochrome 97.8
transportation system 97.6
train 97.3
man 97.1
adult 96.2
group 95.5
group together 94.9
woman 94.9
railway 93.8
vehicle 91.4
three 87.2
locomotive 87
child 82.9
two 81.4
vintage 80.6
street 79.6
subway system 79.4
chair 79.3
airport 79.2

Imagga
created on 2022-01-09

passenger 44.8
interior 26.5
building 24.8
architecture 21.6
modern 20.3
business 20
window 19
urban 18.3
people 17.8
travel 17.6
inside 17.5
silhouette 17.4
shop 16.1
city 15
industry 14.5
barbershop 14.3
indoor 13.7
home 13.5
office 13.4
transportation 12.5
chair 12.5
structure 11.9
industrial 11.8
mercantile establishment 11.5
floor 11.1
gate 11
house 10.9
airport 10.7
room 10.6
indoors 10.5
black 10.2
glass 10.1
man 10.1
reflection 10
departure 9.8
to 9.7
steel 9.7
life 9.7
factory 9.6
scene 9.5
journey 9.4
light 9.3
men 8.6
construction 8.5
seat 8.4
sky 8.3
metal 8
water 8
women 7.9
station 7.7
motion 7.7
old 7.7
person 7.7
work 7.7
place of business 7.7
counter 7.6
furniture 7.6
wood 7.5
tourism 7.4
night 7.1
table 7.1
working 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.1
black and white 97
person 90.2
piano 85.7
clothing 58.7
posing 41.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 24-34
Gender Female, 70.5%
Sad 90.8%
Calm 6.9%
Happy 1.2%
Confused 0.4%
Angry 0.3%
Fear 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 18-24
Gender Male, 74.7%
Calm 58.9%
Surprised 24.8%
Fear 4.4%
Disgusted 3.8%
Confused 3.1%
Happy 2.8%
Sad 1.5%
Angry 0.6%

AWS Rekognition

Age 23-31
Gender Female, 86.1%
Sad 63.7%
Calm 15%
Surprised 8.9%
Angry 5.6%
Confused 2.5%
Fear 1.7%
Disgusted 1.5%
Happy 1.1%

AWS Rekognition

Age 23-33
Gender Male, 59.1%
Calm 97.5%
Happy 1.1%
Sad 0.4%
Disgusted 0.4%
Surprised 0.2%
Confused 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 19-27
Gender Female, 96.8%
Calm 70.7%
Sad 16.4%
Angry 5.9%
Happy 2.4%
Fear 1.4%
Disgusted 1.3%
Surprised 1.1%
Confused 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Piano 69.7%

Categories

Imagga

paintings art 98.9%

Text analysis

Amazon

ROOMS
REST ROOMS
REST
3.3
Idea
Chica

Google

REST ROOMS
REST
ROOMS