Human Generated Data

Title

[Artifact in Moritzburg Museum, Halle]

Date

1930-1931

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.15.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2022-05-28

Person 92.8
Human 92.8
Art 83.4
Sculpture 82.7
Figurine 82.1
Statue 58.8
Archaeology 58.6
Outdoors 55.3
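
The Amazon tags above are the kind of output returned by AWS Rekognition's label detection. A minimal sketch of how such tags could be generated, assuming boto3 is installed, AWS credentials are configured, and using a placeholder file name (the actual image file is not part of this record):

```python
# Sketch: label detection with AWS Rekognition via boto3.
# "feininger_moritzburg.jpg" is a placeholder file name.
import boto3

client = boto3.client("rekognition")

with open("feininger_moritzburg.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the scores above bottom out around 55
    )

# Each label carries a name and a 0-100 confidence score,
# matching the "Person 92.8", "Human 92.8", ... rows above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```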

Imagga
created on 2022-05-28

washer 100
white goods 100
home appliance 94.4
appliance 68.2
durables 31.5
clean 17.5
container 16.3
washing 15.5
close 14.3
bucket 14.1
car 13.8
home 13.5
wash 13.5
door 13.3
water 12.7
dirty 12.6
metal 12.1
bathroom 11.7
interior 11.5
vehicle 11.2
cold 11.2
domestic 10.8
closeup 10.8
vessel 10.8
silver 10.6
house 10
equipment 9.8
old 9.7
cleaning 9.7
detail 9.6
bath 9.5
inside 9.2
automobile 8.6
machine 8.5
travel 8.4
color 8.3
transportation 8.1
wet 8
laundry 8
housework 7.8
people 7.8
auto 7.6
drive 7.6
chrome 7.5
safety 7.4
window 7.3
snow 7.3
drop 7.2
paint 7.2
road 7.2
clothing 7.2
face 7.1
person 7.1
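
The Imagga tags follow the same label-plus-confidence pattern. A minimal sketch against Imagga's v2 tagging endpoint, assuming the `requests` library, placeholder API credentials, and a placeholder hosted image URL; the response shape is taken from Imagga's documented format:

```python
# Sketch: image tagging via the Imagga v2 REST API.
# Credentials and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/feininger_moritzburg.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # HTTP basic auth
)
resp.raise_for_status()

# Assumed response shape: result.tags is a list of
# {"confidence": float, "tag": {"en": str}} objects.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```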

Google
created on 2022-05-28

Microsoft
created on 2022-05-28

human face 93.9
sketch 92.5
drawing 91.3
black and white 87
white 68.7
person 63.5
text 63.3
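
The Microsoft tags correspond to Azure Computer Vision's image tagging feature. A minimal sketch against the v3.2 "analyze" REST endpoint, assuming `requests`, a placeholder resource endpoint and key, and a placeholder file name:

```python
# Sketch: image tagging via the Azure Computer Vision v3.2 REST API.
# Endpoint, key, and file name are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "AZURE_CV_KEY"

with open("feininger_moritzburg.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

# Azure reports confidence on a 0-1 scale; scale to match the rows above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```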

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 91.4%
Happy 38.9%
Angry 19.3%
Calm 17.4%
Fear 11.6%
Surprised 8.3%
Confused 4.3%
Sad 3.4%
Disgusted 2.4%
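
The age, gender, and emotion estimates above come from AWS Rekognition face detection. A minimal sketch, again assuming boto3 with configured credentials and a placeholder file name; `Attributes=["ALL"]` is needed to get age, gender, and emotions rather than the default minimal set:

```python
# Sketch: face analysis with AWS Rekognition via boto3.
# "feininger_moritzburg.jpg" is a placeholder file name.
import boto3

client = boto3.client("rekognition")

with open("feininger_moritzburg.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # the default omits age/gender/emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```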

Microsoft Cognitive Services

Age 32
Gender Female

Feature analysis

Amazon

Person 92.8%

Text analysis

Google

4
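
The single detected string ("4") is consistent with Google Cloud Vision text detection. A minimal sketch, assuming the `google-cloud-vision` client library is installed with application credentials configured, and a placeholder file name:

```python
# Sketch: OCR via Google Cloud Vision text detection.
# "feininger_moritzburg.jpg" is a placeholder file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("feininger_moritzburg.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
if response.text_annotations:
    # The first annotation aggregates all detected text.
    print(response.text_annotations[0].description)
```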