Human Generated Data

Title

Untitled (woman watching baby look into mirror)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16342

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Furniture 99.8
Person 99.3
Human 99.3
Person 98.5
Person 98.1
Clothing 95.5
Apparel 95.5
Cabinet 93.4
Dresser 87.1
Interior Design 75.7
Indoors 75.7
Room 72.4
Drawer 66.7
Sideboard 58.3

Clarifai
created on 2023-10-28

people 99.8
group 98.8
furniture 98.1
woman 97.2
art 96.8
child 96.5
adult 96.4
room 94.1
two 92.3
man 92.2
family 91
mirror 89.7
three 88.2
vintage 85.9
portrait 85.3
one 85.1
veil 85
seat 84.4
son 83.8
museum 83.5

Imagga
created on 2022-02-11

furniture 93.4
china cabinet 91
cabinet 80.1
furnishing 57.6
home 23.1
interior 23
buffet 22.1
television 21.7
modern 18.9
house 18.4
design 15.7
wood 15
computer 14.5
room 14
technology 13.3
screen 13.3
decor 13.2
decoration 13
light 12.7
glass 11.6
indoors 11.4
telecommunication system 11.2
luxury 11.1
business 10.9
frame 10.9
space 10.8
style 10.4
clean 10
art 9.8
working 9.7
contemporary 9.4
floor 9.3
bright 9.3
domestic 9
color 8.9
copy 8.8
work 8.6
comfortable 8.6
elegance 8.4
old 8.3
laptop 8.3
inside 8.3
case 8.1
kitchen 8
architecture 7.8
monitor 7.7
flower 7.7
communication 7.5
man 7.4
indoor 7.3
digital 7.3
smile 7.1

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 92.5
black and white 90
person 89.5
furniture 87.2
sink 73.1
sketch 66.6
drawing 62.7
drawer 53.1
baby 51.3

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 82.1%
Happy 56.1%
Calm 29.5%
Surprised 10%
Sad 2.2%
Fear 0.7%
Disgusted 0.6%
Angry 0.5%
Confused 0.5%

AWS Rekognition

Age 27-37
Gender Male, 99.3%
Surprised 44.1%
Calm 31.7%
Happy 16.4%
Fear 2%
Sad 1.6%
Disgusted 1.6%
Angry 1.3%
Confused 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.3%
Person 98.5%
Person 98.1%

Categories

Imagga

paintings art 63.3%
interior objects 34.7%
text visuals 1.2%

Text analysis

Amazon

8

Google

8
8