Human Generated Data

Title

Copy Print: Bauhaus Band

Date

1927 (printed 1949)

People

Artist: T. Lux Feininger, American, 1910–2011

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Anonymous Gift, BR49.243

Copyright

© T. Lux Feininger

Machine Generated Data

Tags

Amazon
created on 2022-06-18

Clarifai
created on 2023-10-29

people 99.8
adult 98.3
group 97.3
man 96
sitting 95.7
furniture 94.6
group together 93.2
woman 91.7
sit 89
monochrome 87.8
leader 87.3
several 84.9
seat 82.5
recreation 82.5
illustration 82.2
outfit 81.7
music 80.7
many 80.7
musician 77.8
child 77.3

Imagga
created on 2022-06-18

brass 86.4
wind instrument 68.1
musical instrument 53
trombone 47.1
cornet 28.7
man 23.5
people 22.3
person 21.6
male 21.3
adult 20
music 15.7
men 15.4
black 14.4
business 13.3
lifestyle 12.3
work 11.8
concert 11.6
computer 11.3
body 11.2
laptop 11.2
youth 11.1
player 11
guitar 10.9
silhouette 10.7
musician 10.7
job 10.6
modern 10.5
rock 10.4
portrait 10.3
device 10.2
stage 10.2
bass 9.9
human 9.7
fun 9.7
businessman 9.7
office 9.6
play 9.5
entertainment 9.2
equipment 9
group 8.9
sexy 8.8
women 8.7
boy 8.7
party 8.6
dance 8.5
professional 8.4
board 8.4
hand 8.3
style 8.2
happy 8.1
working 7.9
attractive 7.7
crowd 7.7
musical 7.7
employee 7.6
studio 7.6
hot 7.5
leisure 7.5
technology 7.4
room 7.3
star 7.2
transportation 7.2
percussion instrument 7.1

Microsoft
created on 2022-06-18

text 99.6
person 94.6
musical instrument 83.9
clothing 76.6
music 72.8
man 57.8
posing 55.5
cartoon 54.5
old 53
guitar 50.4
vintage 27.7

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 97.6%
Calm 92.1%
Surprised 6.7%
Fear 6.4%
Sad 2.8%
Angry 1.4%
Disgusted 1.1%
Confused 0.6%
Happy 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Categories