Human Generated Data

Title

Untitled (woman arranging flower centerpiece)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16658

Machine Generated Data

Tags

Amazon
created on 2022-02-18

Person 98.6
Human 98.6
Person 95.1
Indoors 66
Face 64.2
Worker 59.9
Building 55.2
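
Tag lists like the one above are typically produced by an image-labeling service that returns label names with confidence scores, which are then filtered by a threshold before display. A minimal sketch in Python, using a hypothetical response dict shaped like an AWS Rekognition `detect_labels` result (the sample values are taken from the Amazon tag list above):

```python
# Hypothetical response mimicking the shape of an AWS Rekognition
# detect_labels result; values copied from the Amazon tag list above.
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.6},
        {"Name": "Indoors", "Confidence": 66.0},
        {"Name": "Face", "Confidence": 64.2},
        {"Name": "Building", "Confidence": 55.2},
    ]
}

def tags_above(response, threshold):
    """Return (name, confidence) pairs at or above the threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]

print(tags_above(response, 60.0))
# → [('Person', 98.6), ('Indoors', 66.0), ('Face', 64.2)]
```

Each provider in this record (Clarifai, Imagga, Google, Microsoft) reports the same kind of name-plus-confidence pairs, so the displayed lists differ only in the models behind them and the cutoff applied.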

Clarifai
created on 2023-10-29

people 99.8
furniture 98.5
man 97.4
adult 96.9
leader 95.2
one 95.2
room 95.1
group 93.4
two 93
three 91.5
chair 90.4
indoors 88.5
administration 88.2
home 86.9
group together 82.7
family 81.2
elderly 79
monochrome 78.8
several 78.8
seat 76.3

Imagga
created on 2022-02-18

musical instrument 29.7
percussion instrument 24.8
blackboard 18.8
person 18.2
people 17.3
man 16.8
marimba 16.5
male 15.6
black 13.8
old 12.5
device 11.7
silhouette 11.6
interior 11.5
home 11.2
grunge 11.1
adult 10.9
design 10.7
art 10.6
men 10.3
elegance 10.1
room 9.6
lifestyle 9.4
business 9.1
portrait 9.1
player 9
work 8.9
indoors 8.8
symbol 8.7
light 8.7
decoration 8.7
wall 8.5
window 8.4
electronic instrument 8.4
house 8.3
drawing 8.3
vintage 8.3
office 8.1
professional 7.8
education 7.8
bartender 7.7
happy 7.5
event 7.4
vibraphone 7.3
team 7.2
classroom 7.1
businessman 7.1
architecture 7
modern 7

Google
created on 2022-02-18

Black 89.7
Plant 88.2
Picture frame 84.1
Black-and-white 83.9
Style 83.8
Window 82.6
Rectangle 77.5
Art 77.5
Font 76.7
Flower 76.3
Monochrome photography 74.4
Snapshot 74.3
Monochrome 72.3
Glass 71.1
Event 70.6
Cabinetry 66.8
Room 66
Darkness 64.5
Still life photography 64.5
Visual arts 63.2

Microsoft
created on 2022-02-18

text 99.2
drawing 98
person 90
clothing 84.7
sketch 84.7
painting 71.4
cartoon 71.3
man 71.2
black and white 60.9
old 60.6
altar 11.1

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 76.4%
Calm 54%
Happy 30.8%
Surprised 13.3%
Sad 0.7%
Confused 0.3%
Fear 0.3%
Angry 0.3%
Disgusted 0.3%

AWS Rekognition

Age 38-46
Gender Male, 90.6%
Calm 69.1%
Fear 13.6%
Surprised 8.6%
Happy 2.4%
Confused 1.7%
Sad 1.7%
Disgusted 1.5%
Angry 1.4%
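
The per-face emotion breakdowns above sum to roughly 100%, and the displayed ordering simply ranks emotions by confidence. A minimal sketch of picking the dominant emotion from a face record shaped like an AWS Rekognition `detect_faces` result (hypothetical dict, values copied from the second face above):

```python
# Hypothetical face record mimicking the shape of an AWS Rekognition
# detect_faces result; values copied from the second face above.
face = {
    "AgeRange": {"Low": 38, "High": 46},
    "Gender": {"Value": "Male", "Confidence": 90.6},
    "Emotions": [
        {"Type": "CALM", "Confidence": 69.1},
        {"Type": "FEAR", "Confidence": 13.6},
        {"Type": "SURPRISED", "Confidence": 8.6},
        {"Type": "HAPPY", "Confidence": 2.4},
    ],
}

def dominant_emotion(face):
    """Return the emotion type with the highest confidence score."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"]

print(dominant_emotion(face))
# → CALM
```

Google Vision, by contrast, reports each attribute as a coarse likelihood bucket ("Very unlikely" through "Very likely") rather than a percentage, which is why its block below looks categorical.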

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.6%
Person 95.1%

Text analysis

Amazon

010
RODVR-EVEELA