Human Generated Data

Title

Copy Print: Members of the Bauhaus Stage Workshop in "Treppenwitz" Costumes on the Roof of the Studio Building

Date

1927 (printed c. 1948)

People

Artist: Irene Hecht Bayer, American, 1898-1991

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Herbert Bayer, BR48.119.B

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 98.8
Person 98.1
Person 97.3
Shoe 96.4
Footwear 96.4
Apparel 96.4
Clothing 96.4
Shoe 95
Person 94.8
Vehicle 86.3
Transportation 86.3
Person 78.3
Bomb 55.5
Weaponry 55.5
Torpedo 55.5
Weapon 55.5

Imagga
created on 2022-01-09

swing 92.6
mechanical device 65.6
plaything 65.3
mechanism 48.9
beach 30.7
sand 21.2
summer 20.6
ocean 20.1
sea 19.5
people 19.5
sun 19.3
outdoor 19.1
vacation 18.8
adult 16.9
water 16.7
person 16.6
leisure 16.6
happy 16.3
sunset 15.3
fun 15
portrait 14.9
sexy 14.5
lifestyle 14.5
relax 14.3
man 14.1
travel 14.1
hair 13.5
attractive 13.3
child 12.9
smile 12.8
relaxation 12.6
silhouette 12.4
outdoors 12.3
male 12.3
sitting 12
pretty 11.9
model 11.7
sky 11.5
holiday 11.5
enjoy 11.3
sunny 11.2
outside 11.1
bikini 11.1
youth 11.1
tropical 11.1
relaxing 10.9
recreation 10.8
fashion 10.6
women 10.3
happiness 10.2
kid 9.7
lady 9.7
play 9.5
sensual 9.1
park 9.1
childhood 9
body 8.8
sport 8.8
brunette 8.7
pole 8.6
walking 8.5
paradise 8.5
tree 8.5
rest 8.4
chair 8.4
tourism 8.2
island 8.2
playing 8.2
building 7.9
day 7.8
black 7.8
color 7.8
tan 7.7
hand 7.6
erotic 7.6
dark 7.5
holidays 7.5
landscape 7.4
shore 7.4
light 7.4
sensuality 7.3
rod 7.2

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

plane 97.4
airplane 95.9
old 93.5
outdoor 89.7
text 86.9
person 82.7
footwear 81.9
man 75.5
clothing 63.5
vintage 44.4
golf cart 34.4
aircraft 29.7

Face analysis

Amazon

Google

AWS Rekognition

Age 34-42
Gender Female, 71.5%
Surprised 62.3%
Confused 12.1%
Calm 8.3%
Sad 7%
Fear 6.3%
Angry 1.9%
Disgusted 1.4%
Happy 0.6%

AWS Rekognition

Age 19-27
Gender Female, 82%
Sad 82.4%
Calm 6.8%
Fear 6.7%
Angry 1.3%
Disgusted 1%
Happy 0.8%
Confused 0.7%
Surprised 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 96.4%

Captions

Microsoft

a vintage photo of an old airplane 85.4%
a vintage photo of a group of people standing around a plane 80.5%
a vintage photo of a group of people sitting around a plane 78.3%