Human Generated Data

Title

[Lyonel Feininger holding model yacht]

Date

c. 1930

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.671.3

Machine Generated Data

Tags

Amazon
created on 2022-07-01

Person 97.9
Human 97.9
Clothing 94.7
Apparel 94.7
Face 72.6
Portrait 61.3
Photo 61.3
Photography 61.3
Performer 60
Female 58.8
Hat 56.8
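
The Amazon tags above pair a label with a percent confidence score. Below is a minimal sketch of how such labels can be produced with the AWS Rekognition DetectLabels API via boto3; the local file name is a placeholder, not a confirmed path:

    import boto3

    client = boto3.client("rekognition")

    with open("BRLF.671.3.jpg", "rb") as f:  # placeholder local copy of the image
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # the list above bottoms out near 50%
        )

    # Each label carries a name and a percent confidence, e.g. "Person 97.9".
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")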

Imagga
created on 2022-07-01

stringed instrument 58.5
musical instrument 47
piano 43
laptop 40.9
man 38.3
grand piano 37.5
keyboard instrument 36
person 35.6
business 35.2
businessman 33.6
computer 32.2
percussion instrument 31.5
people 31.2
bowed stringed instrument 31.2
male 29.8
adult 28
working 25.6
office 22.6
professional 22.4
corporate 22.3
work 22
suit 21.6
device 20.4
viol 19.6
notebook 19.4
businesswoman 19.1
smiling 18.8
happy 18.8
portrait 18.8
success 18.5
executive 18.3
looking 17.6
smile 17.1
technology 16.3
worker 16
job 15.9
scholar 15.7
successful 15.6
hand 15.2
washboard 14.5
manager 14
upright 13.7
lifestyle 13.7
intellectual 13.5
one 13.4
holding 13.2
confident 12.7
student 12.7
handsome 12.5
boss 12.4
sitting 12
indoors 11.4
standing 11.3
attractive 11.2
occupation 11
employee 10.5
businesspeople 10.4
career 10.4
education 10.4
bass 10.3
casual 10.2
alone 10
cello 9.9
black 9.6
women 9.5
men 9.5
paper 9.4
senior 9.4
finance 9.3
communication 9.2
face 9.2
indoor 9.1
modern 9.1
team 9
harpsichord 8.9
home 8.8
using 8.7
table 8.7
day 8.6
teacher 8.5
mature 8.4
studio 8.4
book 8.2
collar 7.7
confidence 7.7
serious 7.6
reading 7.6
telephone 7.6
one person 7.5
contemporary 7.5
keyboard 7.5
voting machine 7.3
cheerful 7.3
music 7.3
playing 7.3
group 7.3
school 7.2
building 7.1
clavier 7.1
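
A sketch of how the Imagga tags could be retrieved, assuming Imagga's v2 tagging endpoint and its documented JSON response shape; the credentials and image URL are placeholders:

    import requests

    IMAGGA_KEY = "your_api_key"        # placeholder credential
    IMAGGA_SECRET = "your_api_secret"  # placeholder credential

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/BRLF.671.3.jpg"},  # placeholder URL
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Tags arrive as {"result": {"tags": [{"confidence": ..., "tag": {"en": ...}}]}},
    # giving pairs like "stringed instrument 58.5".
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")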

Google
created on 2022-07-01

Microsoft
created on 2022-07-01

person 96.4
text 96
man 93.9
indoor 89.5
black and white 52
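
The Microsoft tags likely come from the Azure Computer Vision analyze endpoint. A sketch assuming the v3.2 REST API; the endpoint, key, and image URL are placeholders:

    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                     # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/BRLF.671.3.jpg"},  # placeholder image URL
    )
    response.raise_for_status()

    # Azure scores tags on a 0-1 scale; scale to percent to match "person 96.4".
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")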

Face analysis

AWS Rekognition

Age 50-58
Gender Male, 90.8%
Calm 97.7%
Surprised 6.3%
Fear 5.9%
Sad 2.9%
Happy 0%
Angry 0%
Confused 0%
Disgusted 0%
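
A minimal sketch of reproducing the AWS Rekognition face estimates (age range, gender, emotions) with boto3's detect_faces; the file name is a placeholder:

    import boto3

    client = boto3.client("rekognition")

    with open("BRLF.671.3.jpg", "rb") as f:  # placeholder local copy of the image
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required to get age, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:  # e.g. CALM 97.7
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")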

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
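
The Google Vision ratings above use a likelihood scale (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A sketch of the underlying face_detection call with the google-cloud-vision client; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("BRLF.671.3.jpg", "rb") as f:  # placeholder local copy of the image
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each face annotation rates attributes on the likelihood enum,
    # matching the "Surprise Very unlikely" entries above.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)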

Feature analysis

Amazon

Person 97.9%

Text analysis

Google

電動 A 408 10*760110
A
電動
408
10
*
760110
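
The text results above match the shape of Google Cloud Vision OCR output, where the first annotation is the full detected string and the rest are its individual tokens. A sketch of the text_detection call; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("BRLF.671.3.jpg", "rb") as f:  # placeholder local copy of the image
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    # First the whole string, then one annotation per detected token.
    for annotation in response.text_annotations:
        print(annotation.description)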