Human Generated Data

Title

Romy, Nonquitt

Date

1985

People

Artist: Sarah Benham, American, born 1941

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Artist, P1989.5

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Cake 96.5
Dessert 96.5
Food 96.5
Icing 87.5
Cream 87.5
Creme 87.5
Furniture 80.5
People 79.4
Painting 61.3
Art 61.3
Sweets 60
Confectionery 60
Birthday Cake 56.7
Portrait 56.5
Photography 56.5
Face 56.5
Photo 56.5
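
Label/confidence pairs like the Amazon list above are the kind of output returned by AWS Rekognition's DetectLabels operation. The following is a minimal sketch only, not the museum's pipeline; the boto3 credential setup and the file name "romy_nonquitt.jpg" are assumptions.

# Sketch: producing label/confidence pairs like the Amazon tags above
# with AWS Rekognition DetectLabels. Assumes boto3 is configured with
# AWS credentials; "romy_nonquitt.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("romy_nonquitt.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,
)

# Each entry prints as, e.g., "Person 99.2"
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))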

Clarifai
created on 2023-10-25

people 99.5
girl 99.2
woman 98.6
monochrome 98.1
one 97.6
basket 96.8
food 96.5
adult 96.4
portrait 95.2
beautiful 91.2
set 90.9
art 90.3
music 90.2
lady 89.6
picnic 86.1
wedding 86
musician 85
child 85
restaurant 84.5
container 82.2

Imagga
created on 2022-01-09

container 55
shopping cart 49.1
basket 42.2
handcart 34.9
shopping basket 32.4
shopping 31.2
cart 29.2
wheeled vehicle 25.5
shop 25.3
dishwasher 24.9
buy 24.4
market 19.5
white goods 19.2
store 18.9
trolley 18.7
supermarket 17.7
bucket 17.6
sale 16.6
sitar 16.5
business 15.8
purchase 15.4
money 15.3
retail 15.2
home appliance 14.5
vessel 14.5
stringed instrument 13.7
metal 12.9
man 12.8
product 12.7
buying 12.5
people 12.3
male 12.2
commerce 12.1
happy 11.9
cash 11.9
musical instrument 11.4
push 11.4
person 11.3
empty 11.2
adult 11
finance 11
appliance 10.6
trade 10.5
one 10.4
object 10.3
grocery 9.7
pay 9.6
lifestyle 9.4
smile 9.3
pushcart 8.9
home 8.8
smiling 8.7
conveyance 8.5
black 8.4
tray 8
child 7.8
consumer 7.8
work 7.7
men 7.7
sitting 7.7
device 7.7
hand 7.6
house 7.5
dollar 7.4
conceptual 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

birthday cake 99.7
text 96.5
candle 96.1
person 93.6
cake 92.3
food 88.5
indoor 87.7
black and white 80.9
birthday 79.4
dessert 74.9
wedding cake 72.9
baked goods 61.6

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 90.5%
Sad 68.2%
Calm 25.7%
Fear 3.9%
Disgusted 0.7%
Confused 0.5%
Angry 0.5%
Surprised 0.3%
Happy 0.2%
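
The age range, gender, and ranked emotion scores above match the structure of an AWS Rekognition DetectFaces response with all facial attributes requested. A hedged sketch follows, again assuming boto3 credentials and a placeholder file name.

# Sketch: face attributes like the AWS Rekognition block above, using
# DetectFaces with Attributes=["ALL"]. "romy_nonquitt.jpg" is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("romy_nonquitt.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # List emotions from most to least confident
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")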

Microsoft Cognitive Services

Age 18
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
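
The likelihood ratings above follow the format of a Google Cloud Vision face detection response. Below is a minimal sketch with the google-cloud-vision client; the authentication setup and the file name are assumptions, not part of the record.

# Sketch: likelihood attributes like the Google Vision block above, via the
# google-cloud-vision client. Assumes application default credentials;
# "romy_nonquitt.jpg" is a placeholder file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("romy_nonquitt.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood fields are enums such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)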

Feature analysis

Amazon

Person 99.2%
Painting 61.3%

Categories

Imagga

paintings art 99.5%