Human Generated Data

Title

[Dessau]

Date

1930

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.56.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-06-10

Person 98.3
Human 98.3
Person 97.3
Clothing 93.1
Apparel 93.1
Person 89.6
People 77.4
Outdoors 69.2
Mammal 58.9
Animal 58.9
Duel 55.7

Imagga
created on 2022-06-10

child 40.7
people 29
groom 26.7
kin 24.6
couple 24.4
person 22.9
dress 22.6
man 22.2
adult 21.4
love 21.3
beach 21.1
outdoors 19.5
bride 19
male 18.5
happiness 18
sand 17.7
wedding 17.5
summer 17.4
fashion 16.6
happy 14.4
portrait 14.2
women 14.2
walking 14.2
juvenile 13.7
married 13.4
together 13.1
vacation 13.1
fun 12.7
travel 12.7
elegance 12.6
joy 12.5
lifestyle 12.3
water 12
human 12
marriage 11.4
men 11.2
two 11
model 10.9
smiling 10.9
ocean 10.8
romance 10.7
parent 10.6
sea 10.2
clothing 10.1
family 9.8
cool 9.8
mother 9.5
wife 9.5
youth 9.4
face 9.2
life 9.2
park 9.1
old 9.1
sunset 9
sky 8.9
romantic 8.9
art 8.8
husband 8.8
one 8.2
cheerful 8.1
active 8.1
posing 8
celebration 8
day 7.8
father 7.8
smile 7.8
attractive 7.7
outdoor 7.6
snow 7.6
walk 7.6
sport 7.6
clothes 7.5
silhouette 7.5
world 7.4
joyful 7.4
sun 7.2
sexy 7.2
cute 7.2
hair 7.1
sunlight 7.1

Google
created on 2022-06-10

Microsoft
created on 2022-06-10

outdoor 98.6
ground 98.5
drawing 89.1
clothing 84.6
person 84.4
black and white 77.6
sketch 71.3
text 66
painting 63.8
woman 55.5
old 44.9

Face analysis

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%

Captions

Microsoft

a group of people walking down a dirt road 88.9%
a group of women walking down a dirt road 83.4%
a person walking down a dirt road 83.3%

Text analysis

Google

܂
܀
܂ ܀