Human Generated Data

Title

[Man and woman walking on grass]

Date

unknown

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.410.10

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Human 99.7
Person 99.7
Apparel 99.4
Clothing 99.4
Person 91.5
Nature 84.3
Outdoors 79.1
Female 76
Grass 72.2
Plant 72.2
People 66.1
Woman 64.1
Dress 62.1
Pants 58.8
Ground 58.2
Robe 58.2
Evening Dress 58.2
Fashion 58.2
Gown 58.2
Shorts 57
Skirt 55.6
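Confidence-scored tag lists like the Amazon one above are typically consumed by thresholding and de-duplicating. A minimal Python sketch (labels and scores copied from the list above; the 70% cutoff is an arbitrary choice for illustration, not part of the record):

```python
# Rekognition-style tags as (label, confidence-percent) pairs,
# copied from the Amazon list above (truncated for brevity).
tags = [
    ("Human", 99.7), ("Person", 99.7), ("Apparel", 99.4), ("Clothing", 99.4),
    ("Person", 91.5), ("Nature", 84.3), ("Outdoors", 79.1), ("Female", 76.0),
    ("Grass", 72.2), ("Plant", 72.2), ("People", 66.1), ("Woman", 64.1),
]

def confident_labels(tags, threshold=70.0):
    """Return de-duplicated labels at or above the confidence threshold,
    highest-scoring first."""
    seen = set()
    out = []
    for label, score in sorted(tags, key=lambda t: -t[1]):
        if score >= threshold and label not in seen:
            seen.add(label)
            out.append(label)
    return out

print(confident_labels(tags))
# -> ['Human', 'Person', 'Apparel', 'Clothing', 'Nature',
#     'Outdoors', 'Female', 'Grass', 'Plant']
```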

Clarifai
created on 2021-04-05

people 100
adult 98.5
two 98
group together 97.5
child 97.4
woman 95.8
man 95.7
group 94.5
one 92.8
leader 91.9
wear 90.2
three 89.4
four 85.9
several 83.8
boy 83.4
administration 82.1
family 80.9
many 79.8
street 79.1
actor 78.3

Imagga
created on 2021-04-05

crutch 37.6
staff 29.1
stick 24.2
man 22.2
people 20.1
adult 18.8
person 18.6
summer 16.7
outdoors 16.7
outdoor 16.1
male 15.8
sport 15.1
vacation 13.9
active 13.5
attractive 13.3
leisure 13.3
lifestyle 13
pedestrian 12.8
dress 12.6
walking 12.3
couple 12.2
beach 11.8
model 11.7
park 11.5
fashion 11.3
love 11
portrait 11
recreation 10.8
leg 10.6
sun 10.5
outside 10.3
child 10.1
exercise 10
sunset 9.9
tool 9.9
pretty 9.8
fun 9.7
one 9.7
body 9.6
women 9.5
relax 9.3
black 9.1
hand 9.1
cricket bat 8.8
country 8.8
hair 8.7
grass 8.7
light 8.7
two 8.5
cleaner 8.5
sand 8.4
old 8.4
sky 8.3
silhouette 8.3
life 8.2
happy 8.1
sexy 8
autumn 7.9
day 7.8
standing 7.8
mechanical device 7.7
shoes 7.7
youth 7.7
enjoy 7.5
style 7.4
water 7.3
sprinkler 7.3
alone 7.3
lady 7.3
cricket equipment 7.3
protection 7.3
danger 7.3
swing 7.1
work 7.1
travel 7
sea 7
tree 7

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

outdoor 99
ground 97.4
clothing 94.7
person 90.2
wedding dress 89.9
man 88.7
bride 77.6
dress 71.2
footwear 54.9
ruin 31.3

Face analysis

AWS Rekognition

Age 51-69
Gender Female, 56.7%
Calm 83%
Sad 3.5%
Confused 3.4%
Fear 3.1%
Surprised 2.9%
Happy 2.8%
Angry 0.9%
Disgusted 0.4%

AWS Rekognition

Age 46-64
Gender Female, 64.3%
Calm 56.7%
Happy 35.9%
Sad 6.2%
Angry 0.6%
Confused 0.3%
Disgusted 0.1%
Fear 0.1%
Surprised 0%
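Each Rekognition face record above reports a confidence per emotion; a common way to summarize such a record is to take the highest-scoring emotion. A minimal sketch, with scores copied from the second face record above:

```python
# Emotion scores (percent) from the second AWS Rekognition face record above.
face = {
    "Calm": 56.7, "Happy": 35.9, "Sad": 6.2, "Angry": 0.6,
    "Confused": 0.3, "Disgusted": 0.1, "Fear": 0.1, "Surprised": 0.0,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # -> ('Calm', 56.7)
```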

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man and a woman walking down a dirt road 88.6%
a man walking down a dirt road 88.5%
a group of people walking down a dirt road 88.4%