Human Generated Data

Title

[Woman Reading]

Date

1930-1931

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.44.2

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2022-06-10

Human 89.3
Pillow 88.6
Cushion 88.6
Clothing 86.3
Apparel 86.3
Furniture 85.8
Monitor 85.6
Display 85.6
Screen 85.6
Electronics 85.6
Person 84.4
Mobile Phone 81.5
Phone 81.5
Cell Phone 81.5
LCD Screen 73.3
Bed 68.5
Face 61.9
Bedroom 59.4
Room 59.4
Indoors 59.4
Suit 56.6
Coat 56.6
Overcoat 56.6
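
The Amazon tags above pair a Rekognition object label with a confidence score (a percentage). A minimal sketch of how such labels are typically requested through the boto3 DetectLabels call is shown below; the filename and thresholds are hypothetical, and this is not a record of how these particular tags were generated.

    import boto3

    rekognition = boto3.client("rekognition")

    # Read a local copy of the photograph (hypothetical filename).
    with open("woman_reading.jpg", "rb") as f:
        image_bytes = f.read()

    # Request object/scene labels above a minimum confidence threshold.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,
        MinConfidence=55.0,
    )

    # Each label carries a name and a confidence, as in the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))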

Imagga
created on 2022-06-10

business 34
money 26.4
finance 22.8
paper 21.3
computer 20.1
office 20.1
laptop 18.1
notebook 16.9
dollar 16.7
currency 16.2
packet 15.8
pen 15.6
device 15.2
investment 14.7
success 14.5
financial 14.3
work 14.1
technology 14.1
note 13.8
desk 13.6
container 13.5
wealth 13.5
bank 13.4
bill 13.3
package 13.2
monitor 13.1
cash 12.8
perfume 12.1
document 11.1
banking 11
key 10.8
hand 10.6
keyboard 10.6
toiletry 10.2
data 10
crossword puzzle 9.9
information 9.7
screen 9.7
design 9.2
house 9.2
close 9.1
holding 9.1
businessman 8.8
object 8.8
film 8.8
home 8.8
table 8.7
banknote 8.7
loan 8.6
corporate 8.6
plan 8.5
stock 8.4
black 8.4
adult 8.4
rich 8.4
man 8.1
puzzle 8
market 8
job 8
book 7.9
equipment 7.8
dollars 7.7
payment 7.7
old 7.7
exchange 7.6
communication 7.6
writing 7.5
sign 7.5
sale 7.4
retro 7.4
digital 7.3
person 7.3
businesswoman 7.3
negative 7.1
concepts 7.1
copy 7.1
silver 7.1

Google
created on 2022-06-10

Microsoft
created on 2022-06-10

computer 96.8
laptop 95
text 90.8
black and white 84.1

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 64.6%
Calm 81%
Sad 8.5%
Surprised 7.2%
Fear 6.9%
Disgusted 1.8%
Angry 1.3%
Confused 1.3%
Happy 0.4%
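
The age range, gender, and emotion percentages above correspond to the fields returned by Rekognition's DetectFaces API when full face attributes are requested. A minimal sketch, again assuming a hypothetical local copy of the image:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("woman_reading.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Request all face attributes (age range, gender, emotions, ...).
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]      # e.g. {"Low": 38, "High": 46}
        gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 64.6}
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        print(f'{age["Low"]}-{age["High"]}', gender["Value"],
              [(e["Type"], round(e["Confidence"], 1)) for e in emotions])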

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
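
The Google Vision rows above report likelihood buckets (for example "Very unlikely") rather than percentages; they match the fields returned by the face_detection method of the google-cloud-vision client. A minimal sketch, with the same hypothetical filename:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("woman_reading.jpg", "rb") as f:  # hypothetical filename
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    # Likelihood enums (VERY_UNLIKELY .. VERY_LIKELY) for each detected face.
    for face in response.face_annotations:
        print(face.surprise_likelihood, face.anger_likelihood,
              face.sorrow_likelihood, face.joy_likelihood,
              face.headwear_likelihood, face.blurred_likelihood)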

Feature analysis

Amazon

Person 84.4%
Mobile Phone 81.5%

Categories

Imagga

paintings art 99.9%

Captions

Text analysis

Amazon

USC