Human Generated Data

Title

[Woman Reading]

Date

1930-1931

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.44.1

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags (each entry is a label followed by the service's confidence score, 0-100)

Amazon
created on 2022-06-10

Person 86.8
Human 86.8
Clothing 81.7
Apparel 81.7
Mobile Phone 71.3
Electronics 71.3
Phone 71.3
Cell Phone 71.3
Luggage 64.7
Text 57.5
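
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of how such tags could be produced, assuming boto3 is installed, AWS credentials are configured, and a hypothetical local file woman_reading.jpg holds the digitized photograph:

import boto3

client = boto3.client("rekognition")

# "woman_reading.jpg" is a placeholder filename standing in for the digitized image.
with open("woman_reading.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # keep only labels the model is at least 50% sure about
    )

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")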

Imagga
created on 2022-06-10

business 29.2
computer 23.1
money 20.4
office 20.3
laptop 19
paper 16.6
notebook 15.9
finance 15.2
work 14.9
technology 14.8
success 14.5
black 13.2
hand 12.9
investment 11.9
cash 11.9
currency 11.7
man 10.9
businessman 10.6
keyboard 10.5
home 10.4
desk 10.3
dollar 10.2
people 10
box 10
person 9.9
male 9.9
financial 9.8
old 9.8
adult 9.7
rubbish 9.4
device 9.3
book 9.2
house 9.2
present 9.1
businesswoman 9.1
holding 9.1
suit 9
wealth 9
screen 8.9
perfume 8.8
gift 8.6
bill 8.6
bank 8.3
note 8.3
briefcase 8.1
object 8.1
key 8
job 8
silver 8
corporate 7.7
modern 7.7
toiletry 7.7
table 7.5
design 7.5
manager 7.5
monitor 7.4
close 7.4
symbol 7.4
digital 7.3
smiling 7.2
working 7.1
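
Imagga exposes its auto-tagging as a REST endpoint rather than an SDK. A sketch against the documented /v2/tags endpoint, assuming the requests library and placeholder API credentials; Imagga reports confidence on a 0-100 scale, matching the list above:

import requests

API_KEY = "your_api_key"        # placeholder; issued by Imagga
API_SECRET = "your_api_secret"  # placeholder

with open("woman_reading.jpg", "rb") as f:  # hypothetical filename
    resp = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),  # HTTP Basic auth with the key pair
        files={"image": f},
    )

for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")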

Google
created on 2022-06-10

White 92.2
Black 89.8
Black-and-white 86.7
Style 84.2
Font 82.7
Monochrome 78.2
Automotive design 77.2
Monochrome photography 77.1
T-shirt 73.7
Event 67.1
Sitting 65.3
Stock photography 63.1
Room 62.6
Eyewear 59.1
Reading 58.7
Street 58.6
Fun 57.6
City 54.6
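
The Google tags correspond to the Cloud Vision API's label detection. The client returns scores in the 0-1 range, so they are scaled here to match the percentages above; a sketch assuming the google-cloud-vision package and application-default credentials:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

with open("woman_reading.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")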

Microsoft
created on 2022-06-10

text 88.1
laptop 83.4
black and white 75.2
computer 74.3
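
The Microsoft tags match the output of the Azure Computer Vision tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK, with placeholder endpoint and key; confidence comes back in 0-1 and is scaled to match the percentages above:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_subscription_key"                                      # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("woman_reading.jpg", "rb") as f:  # hypothetical filename
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")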

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 53.3%
Surprised 60%
Calm 42%
Fear 8%
Angry 4.5%
Disgusted 3.6%
Sad 3.5%
Confused 3.4%
Happy 1.9%
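
The age range, gender, and emotion estimates above are the fields the Rekognition DetectFaces API returns when all facial attributes are requested. A sketch under the same assumptions as the DetectLabels example:

import boto3

client = boto3.client("rekognition")

with open("woman_reading.jpg", "rb") as f:  # hypothetical filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence to match the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")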

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
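
Google Vision's face detection reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why this block carries no percentages. A sketch with the same google-cloud-vision assumptions as above:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("woman_reading.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)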

Feature analysis

Amazon

Person 86.8%
Mobile Phone 71.3%
