Human Generated Data

Title

Herbert Bayer

Date

1927

People

Artist: Irene Hecht Bayer, American, 1898–1991

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Herbert Bayer, BR51.273

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 98.4
Human 98.4
Text 94.6
Furniture 88.6
Table 67
Sitting 66.3
Desk 65.5
Portrait 62.2
Photo 62.2
Face 62.2
Photography 62.2
Accessory 60.1
Accessories 60.1
Tie 60.1
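
Labels of this kind can be reproduced with Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3, assuming the image sits in S3 (the region, bucket, and key below are placeholders, not part of the original record):

import boto3

# Placeholder region; credentials come from the usual AWS config chain
client = boto3.client("rekognition", region_name="us-east-1")

# DetectLabels returns label names with confidence scores on a 0-100 scale,
# matching the name/score pairs listed above
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "my-images", "Name": "BR51.273.jpg"}},
    MaxLabels=20,
    MinConfidence=60,
)
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))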

Imagga
created on 2022-01-29

newspaper 58.4
product 49.2
laptop 42.8
computer 42.2
person 40.2
creation 38.9
office 36.1
working 34.5
work 33.8
business 33.4
scholar 33.4
sitting 31.8
intellectual 29.7
people 29.6
desk 28.5
adult 26
man 25.6
daily 23.1
smiling 22.4
looking 22.4
home 22.4
technology 22.3
smile 22.1
male 21.4
paper 21.3
student 21.1
notebook 21
pen 20.6
book 19.4
happy 19.4
professional 18.2
businessman 17.7
indoors 17.6
education 17.3
studying 17.3
reading 17.1
casual 17
attractive 16.1
table 15.7
portrait 15.5
corporate 15.5
writing 15.1
wireless 14.3
communication 14.3
women 14.2
college 14.2
worker 14.2
one 14.2
job 14.2
keyboard 14.1
businesswoman 13.6
mature 13
success 12.9
successful 12.8
color 12.2
study 12.1
executive 12
happiness 11.8
using 11.6
sofa 11.5
lady 11.4
write 11.3
phone 11.1
lifestyle 10.9
hand 10.6
room 10.5
university 10.5
modern 10.5
pretty 10.5
workplace 10.5
men 10.3
expression 10.2
document 10.2
telephone 10
handsome 9.8
hands 9.6
one person 9.4
learning 9.4
senior 9.4
manager 9.3
indoor 9.1
studio 9.1
relaxing 9.1
suit 9
school 9
browsing 8.8
jacket 8.7
serious 8.6
face 8.5
career 8.5
sit 8.5
finance 8.5
holding 8.3
alone 8.2
confident 8.2
blond 8.1
cute 7.9
seated 7.8
assistant 7.8
couch 7.7
call 7.6
horizontal 7.5
leisure 7.5
single 7.4
friendly 7.3
team 7.2
bright 7.2
interior 7.1
look 7
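
Imagga's tags come from its REST tagging endpoint. A sketch with the requests library; the credentials and image URL are placeholders, and the exact response shape is an assumption based on Imagga's v2 API:

import requests

# Placeholder API key/secret and image URL
auth = ("acc_xxxxxxxx", "secret_xxxxxxxx")
params = {"image_url": "https://example.com/BR51.273.jpg"}

# The /v2/tags endpoint returns tags with confidence scores on a 0-100 scale
resp = requests.get("https://api.imagga.com/v2/tags", params=params, auth=auth)
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], item["confidence"])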

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 99
person 98.7
human face 93.4
man 93.2
sketch 91.6
drawing 89.8
clothing 82.8
old 80.1
handwriting 69.7
black and white 68.5
portrait 59.4
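
Tags like these correspond to the tagging feature of Azure's Computer Vision service. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and subscription key
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# tag_image returns tags with confidences in [0, 1]; scale by 100
# to compare with the scores listed above
result = client.tag_image("https://example.com/BR51.273.jpg")
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))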

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Sad 69.9%
Calm 26.4%
Confused 1.9%
Angry 0.5%
Disgusted 0.5%
Fear 0.3%
Happy 0.2%
Surprised 0.2%
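
Age range, gender, and emotion scores of this form come from Rekognition's DetectFaces operation when all attributes are requested. A minimal boto3 sketch (region, bucket, and key are placeholders):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each FaceDetail
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "my-images", "Name": "BR51.273.jpg"}},
    Attributes=["ALL"],
)
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print("Gender", face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
    for emotion in face["Emotions"]:
        print(emotion["Type"], round(emotion["Confidence"], 1))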

Microsoft Cognitive Services

Age 33
Gender Male
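
When this record was generated, age and gender estimates were available from the Azure Face API; Microsoft has since retired these attributes. A sketch with the azure-cognitiveservices-vision-face SDK as it stood then (endpoint and key are placeholders):

from azure.cognitiveservices.vision.face import FaceClient
from azure.cognitiveservices.vision.face.models import FaceAttributeType
from msrest.authentication import CognitiveServicesCredentials

client = FaceClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# Request age and gender attributes for each detected face
faces = client.face.detect_with_url(
    url="https://example.com/BR51.273.jpg",
    return_face_attributes=[FaceAttributeType.age, FaceAttributeType.gender],
)
for face in faces:
    print("Age", face.face_attributes.age)
    print("Gender", face.face_attributes.gender)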

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
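
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why the rows above read "Very unlikely". A sketch with the google-cloud-vision client (the local filename is a placeholder):

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses application default credentials

with open("BR51.273.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Each face annotation carries one likelihood enum per attribute
response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)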

Feature analysis

Amazon

Person 98.4%
Tie 60.1%

Captions

Microsoft

a man sitting on a newspaper 61.9%
a man sitting on a table 61.8%
a vintage photo of a man sitting on a table 61.7%
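
Ranked captions with confidence scores like these come from the describe feature of the same Azure Computer Vision service used for tagging above. A minimal sketch (placeholder endpoint, key, and URL):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# describe_image returns up to max_candidates captions, each with a
# confidence in [0, 1]
result = client.describe_image("https://example.com/BR51.273.jpg", max_candidates=3)
for caption in result.captions:
    print(caption.text, round(caption.confidence * 100, 1))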

Text analysis

Amazon

DESSAU
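
The detected word comes from Rekognition's DetectText operation, which returns LINE and WORD detections with confidence scores. A minimal boto3 sketch (region, bucket, and key are placeholders):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# DetectText would surface the word "DESSAU" visible in the photograph
response = client.detect_text(
    Image={"S3Object": {"Bucket": "my-images", "Name": "BR51.273.jpg"}}
)
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], round(detection["Confidence"], 1))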