Human Generated Data

Title

Under the Manhattan Bridge, Brooklyn

Date

1993

People

Artist: Eugene Richards, American, born 1944

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barnabas McHenry, P1999.44

Copyright

© Eugene Richards

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 98.4
Person 98.4
Person 96.2
Transportation 94.9
Vehicle 94.8
Person 89
Person 85.2
Motorcycle 84.2
Person 80.4
Road 65.8
Bike 64.3
Bicycle 64.3
Vespa 57.1
Motor Scooter 57.1
Person 45.7

Clarifai
created on 2023-10-25

people 99.8
monochrome 98
street 96.7
man 96.6
adult 96.5
group together 94.7
vehicle 93.8
transportation system 93.1
group 92.6
one 88.6
sport 87.5
woman 86.5
watercraft 86.5
vintage 85.7
athlete 84
wear 83.8
recreation 83
art 82.2
illustration 81.4
sea 80.6

Imagga
created on 2022-01-09

sketch 36
drawing 30.6
architecture 25.9
building 23
structure 21
business 20
city 19.9
urban 19.2
modern 18.9
representation 18.1
sky 15.9
athletic facility 15.4
house 15
old 14.6
construction 13.7
finance 13.5
billboard 13
grunge 12.8
signboard 12.7
football stadium 12.6
tower 12.5
financial 12.5
negative 12.4
technology 11.9
film 11.3
design 11.3
art 11.1
equipment 11.1
facility 10.7
bridge 10.7
exterior 10.1
vintage 9.9
success 9.7
skyline 9.5
world 9.5
cityscape 9.5
buildings 9.4
street 9.2
office 9.1
retro 9
night 8.9
high 8.7
downtown 8.6
black 8.5
buy 8.4
stadium 8.4
town 8.3
texture 8.3
shopping cart 8.3
digital 8.1
skyscraper 8
market 8
home 8
antique 7.8
ancient 7.8
architectural 7.7
industry 7.7
outdoor 7.6
tall 7.5
frame 7.5
silhouette 7.4
style 7.4
shopping 7.3
global 7.3
new 7.3
metal 7.2
aged 7.2
landmark 7.2
history 7.2
wall 7.1
growth 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.2
black and white 89.3
person 83.3
drawing 82.4
sketch 56.3

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 54.7%
Calm 86.7%
Fear 4.8%
Surprised 2.3%
Sad 1.7%
Disgusted 1.4%
Confused 1.3%
Angry 1.2%
Happy 0.7%

AWS Rekognition

Age 2-10
Gender Female, 83.7%
Happy 42.7%
Confused 16.2%
Calm 15.8%
Sad 11.7%
Angry 4.3%
Disgusted 4.1%
Fear 2.9%
Surprised 2.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%

Categories

Text analysis

Amazon

STOP

Google

..... STOP
.....
STOP