Human Generated Data

Title

[A man and two women walking]

Date

1931?

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.289.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Person 99.9
Human 99.9
Vegetation 99.8
Plant 99.8
Person 99.6
Person 99.4
Nature 99.1
Woodland 99.1
Land 99.1
Outdoors 99.1
Tree 99.1
Forest 99.1
Clothing 98.4
Apparel 98.4
Grove 96
Walking 94.8
Dress 92.4
Path 87.2
Standing 81.5
People 75.3
Face 74.4
Female 72.1
Yard 70
Hand 69.7
Grass 69.4
Pants 65.3
Sunlight 64.2
Jungle 61.7
Coat 61.7
Photo 60.2
Photography 60.2
Kid 59.4
Child 59.4
Urban 58.4
Man 57.2
Shorts 56.7
Pedestrian 56
Road 55.8
Woman 55.6
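
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API, with confidence given in percent. A minimal sketch of how such tags could be regenerated with boto3 follows; the file name, region, and threshold are assumptions for illustration, not part of this record.

    import boto3

    # Assumed region and a placeholder local copy of the photograph.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("feininger_brlf_289_2.jpg", "rb") as f:  # placeholder filename
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=55,  # the lowest score listed above is 55.6
        )

    # Each label carries a name and a confidence score in percent,
    # matching the "Person 99.9", "Walking 94.8", ... pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')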

Clarifai
created on 2019-11-19

people 100
adult 99
man 98.3
group together 97.8
group 97.3
woman 95.1
war 94.8
military 94.2
many 94.2
child 93.8
administration 91.4
two 89.7
soldier 89.6
wear 88.1
vehicle 83.6
street 83.1
boy 82.3
calamity 82.1
recreation 81.2
police 80.1

Imagga
created on 2019-11-19

shovel 75.7
hand tool 40.1
tool 37.4
man 35.6
fire iron 19.6
male 19.2
person 18.1
water 18
people 17.8
beach 17.7
sea 15.6
sport 15.6
summer 15.4
snow 14.3
squeegee 14.2
walking 14.2
winter 13.6
adult 13.6
outdoor 13
outdoors 12.8
recreation 12.5
cleaning implement 12.4
silhouette 12.4
dark 11.7
sunset 11.7
ocean 11.6
hiking 11.5
vacation 11.5
boy 11.3
fun 11.2
lifestyle 10.8
leisure 10.8
travel 10.6
walk 10.5
forest 10.4
action 10.2
danger 10
activity 9.8
sun 9.7
men 9.4
dirty 9
wet 8.9
sky 8.9
sand 8.9
working 8.8
outside 8.6
hobby 8.5
black 8.5
environment 8.2
tree 8.2
landscape 8.2
exercise 8.2
worker 8
wall 8
mask 8
child 7.9
couple 7.8
destruction 7.8
backpack 7.8
toxic 7.8
hike 7.8
youth 7.7
adventure 7.6
park 7.4
protection 7.3
active 7.2
chain saw 7.2
mountain 7.1
cool 7.1
fisherman 7.1
weather 7.1
day 7.1
businessman 7.1
device 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

outdoor 99.8
clothing 96.8
person 92.4
tree 90.1
footwear 86.4
black and white 86.2
man 51.1
posing 38.1

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 50.5%
Happy 51.4%
Angry 45.1%
Surprised 46.2%
Disgusted 45%
Fear 45.7%
Sad 45.2%
Confused 45%
Calm 46.3%

AWS Rekognition

Age 22-34
Gender Male, 52%
Disgusted 45.1%
Sad 45.2%
Fear 45.1%
Happy 45.5%
Surprised 45.6%
Confused 45.1%
Calm 53.5%
Angry 45.1%

AWS Rekognition

Age 18-30
Gender Female, 52.2%
Angry 45.3%
Surprised 45.6%
Fear 45.1%
Disgusted 45%
Calm 47.4%
Confused 45.2%
Sad 51.3%
Happy 45.1%
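
The three blocks above follow the shape of AWS Rekognition's DetectFaces output: one face detail per detected face, each with an estimated age range, a gender estimate, and per-emotion confidences in percent. A hedged sketch of how such values could be read back with boto3 is shown below; the file name and region are assumptions.

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

    with open("feininger_brlf_289_2.jpg", "rb") as f:  # placeholder filename
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion attributes
        )

    # One FaceDetail per detected face, mirroring the three blocks above.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')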

Feature analysis

Amazon

Person 99.9%
