Human Generated Data

Title

Joe Ambrosio and Crew, Janes Lane, Lloyd Harbor, New York

Date

September 1996

People

Artist: N. Jay Jaffee, American, 1921 - 1999

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Artist, P1998.132

Copyright

© The N. Jay Jaffee Trust. All rights reserved. Used by permission. www.njayjaffee.com

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Person 99.5
Furniture 99.5
Person 99.3
Hammock 66.2
Face 56.6
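
For illustration only: the label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch using the boto3 client follows; the file name, label limit, and confidence threshold are placeholder assumptions, not the pipeline actually used to generate this record.

    # Minimal sketch: label detection with Amazon Rekognition via boto3.
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Print "Label confidence" pairs, similar to the tag list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")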

Clarifai
created on 2023-10-25

people 99.9
group 98.8
group together 98.7
vehicle 98
man 97.8
three 97.3
transportation system 96.8
two 96.6
adult 95.3
child 88.4
waste 88.4
wear 88.1
calamity 87.1
one 84.3
bucket 84
documentary 81.8
war 81.5
boy 80.3
four 80.2
soldier 79.7

Imagga
created on 2021-12-14

plow 74.6
tool 61.3
barrow 24.1
handcart 20.4
man 20.1
wheeled vehicle 19
vehicle 18.1
outdoor 16.8
old 16
field 15.9
people 14.5
landscape 14.1
outdoors 13.4
sky 13.4
summer 12.9
vintage 12.4
person 12
male 11.3
countryside 11
travel 10.6
black 10.2
sport 10
rural 9.7
work 9.6
lifestyle 9.4
silhouette 9.1
environment 9
vacation 9
active 9
fun 9
transportation 9
sun 8.8
working 8.8
wreckage 8.8
building 8.7
grass 8.7
outside 8.6
industry 8.5
two 8.5
adult 8.4
leisure 8.3
protection 8.2
danger 8.2
activity 8.1
farm 8
mountain 8
day 7.8
destruction 7.8
life 7.8
track 7.8
cloud 7.7
truck 7.7
grunge 7.7
dirt 7.6
part 7.5
machine 7.4
shovel 7.3
dirty 7.2
road 7.2
sunset 7.2
sand 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.5
person 89.1
outdoor 85.9
black and white 78.5
man 75.1
clothing 67.1

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 99.4%
Calm 66.9%
Sad 25.8%
Angry 3.5%
Confused 2.5%
Disgusted 0.4%
Surprised 0.4%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 22-34
Gender Male, 97.1%
Calm 89.4%
Sad 3.7%
Angry 3.3%
Confused 1.5%
Fear 1%
Surprised 0.5%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 39-57
Gender Male, 98.1%
Angry 94.2%
Calm 5.3%
Confused 0.2%
Sad 0.2%
Disgusted 0.2%
Surprised 0%
Happy 0%
Fear 0%
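
The age range, gender, and emotion percentages above resemble the output of Amazon Rekognition's DetectFaces operation with all facial attributes requested. A minimal sketch with boto3, assuming a placeholder image file rather than the actual workflow behind this record:

    # Minimal sketch: face analysis with Amazon Rekognition via boto3.
    # Attributes=["ALL"] requests age range, gender, and emotion estimates.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # placeholder file name
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")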

Microsoft Cognitive Services

Age 49
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
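
The likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) match the fields returned by Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library and a placeholder image path; this is not necessarily how the record was produced.

    # Minimal sketch: face detection with Google Cloud Vision.
    # "photo.jpg" is a placeholder; credentials are assumed to be configured.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihood enum values map to these names (per the Vision API docs).
    likelihood = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY")

    for face in response.face_annotations:
        print("Surprise", likelihood[face.surprise_likelihood])
        print("Anger", likelihood[face.anger_likelihood])
        print("Sorrow", likelihood[face.sorrow_likelihood])
        print("Joy", likelihood[face.joy_likelihood])
        print("Headwear", likelihood[face.headwear_likelihood])
        print("Blurred", likelihood[face.blurred_likelihood])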

Feature analysis

Amazon

Person 99.7%

Categories

Imagga

paintings art 100%

Captions

Microsoft
created on 2021-12-14

a person sitting on the ground 54.9%

Text analysis

Amazon

FORD
COUNT
DEVITIONS
GATTER
wan GATTER
wan

Google

COUNT FORD
COUNT
FORD
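
The word fragments listed under Text analysis read like raw OCR detections from the photograph. As an illustrative sketch only (assuming boto3 and a placeholder file name, not the actual extraction pipeline), Amazon Rekognition's DetectText operation returns such strings:

    # Minimal sketch: text (OCR) detection with Amazon Rekognition via boto3.
    # "photo.jpg" is a placeholder path, not the museum's actual image file.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Print line-level detections, similar to the fragments listed above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])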