Human Generated Data

Title

Dinner

Date

c. 1887

People

Artist: Adolfo Farsari, American, 1841 - 1898

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Janet and Daniel Tassel, 2007.219.3.19.1

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2019-04-07

Human 99.6
Person 99.6
Person 98.4
Person 98.3
Clothing 97
Apparel 97
Robe 81.9
Fashion 81.9
Gown 75.4
Female 70.2
Sitting 57.5
Text 57.4
Photo 56.9
Photography 56.9
Face 56.2
Portrait 56.2
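
The rows above appear to be label-detection output from Amazon Rekognition. A minimal sketch of how such tags could be generated, assuming boto3 credentials are configured and a hypothetical local copy of the photograph named dinner.jpg:

    import boto3

    rekognition = boto3.client("rekognition")

    # "dinner.jpg" is a hypothetical filename, not a path from this record.
    with open("dinner.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Rekognition reports confidence on a 0-100 scale, matching the
    # "Human 99.6", "Person 99.6", ... rows listed above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")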

Clarifai
created on 2018-03-16

people 99.2
adult 98.2
group 98.2
woman 97.8
man 96.7
office 92.9
furniture 92.6
child 92
room 91.5
business 90.7
vehicle 89.9
wear 89.3
education 88.9
cooperation 87.3
teamwork 86.6
sitting 86.2
indoors 84.2
school 83.9
technology 83.7
facial expression 83.6
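
The Clarifai tags were presumably produced by Clarifai's general image-recognition model. A hedged sketch against the Clarifai v2 predict REST endpoint; the API key, model identifier, and image URL below are placeholders, not values taken from this record:

    import requests

    CLARIFAI_API_KEY = "YOUR_API_KEY"           # assumption: placeholder key
    MODEL_ID = "general-image-recognition"      # assumption: public general model
    IMAGE_URL = "https://example.com/dinner.jpg"

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    resp.raise_for_status()

    # Concepts come back with a 0-1 "value"; the listing above shows them
    # as percentages (e.g. "people 99.2").
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")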

Imagga
created on 2018-03-16

bicycle 48.2
bicycle-built-for-two 46.2
wheeled vehicle 39.7
vehicle 34.8
wheelchair 32.3
man 32.2
sport 29
people 23.4
male 22.7
chair 19.6
active 18.1
conveyance 17.8
person 17.3
seat 16.5
men 16.3
bike 14.6
outdoor 14.5
lifestyle 14.4
adult 14.4
outdoors 14.2
speedway 12.7
happy 12.5
city 12.5
tricycle 12.2
fun 12
speed 11.9
exercise 11.8
transportation 11.6
urban 11.3
building 11.2
competition 11
silhouette 10.8
recreation 10.7
team 10.7
travel 10.6
stick 10.6
athlete 10.5
racetrack 10.4
furniture 10.3
motion 10.3
power 10.1
transport 10
polo mallet 9.9
health 9.7
smiling 9.4
business 9.1
attractive 9.1
activity 8.9
disabled 8.9
office 8.8
riding 8.8
day 8.6
race 8.6
architecture 8.6
wheel 8.5
fast 8.4
old 8.3
fitness 8.1
businessman 7.9
mallet 7.9
couple 7.8
smile 7.8
course 7.8
cycle 7.8
sitting 7.7
modern 7.7
winter 7.6
work 7.5
action 7.4
sports equipment 7.4
newspaper 7.3
cheerful 7.3
teenager 7.3
road 7.2
game 7.1
women 7.1
family 7.1
cool 7.1
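
The Imagga tags would have come from Imagga's auto-tagging service. A hedged sketch, assuming the v2 /tags endpoint with HTTP Basic authentication and placeholder credentials:

    import requests

    IMAGGA_KEY = "YOUR_API_KEY"         # assumption: placeholder credentials
    IMAGGA_SECRET = "YOUR_API_SECRET"
    IMAGE_URL = "https://example.com/dinner.jpg"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    resp.raise_for_status()

    # Each entry pairs a 0-100 confidence with a language-keyed tag name,
    # e.g. "bicycle 48.2" as listed above.
    for entry in resp.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")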

Google
created on 2018-03-16

picture frame 82.2
art 70.4
recreation 52.7
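
The Google tags match the output shape of Cloud Vision label detection. A minimal sketch, assuming the google-cloud-vision client library, application default credentials, and a hypothetical local copy of the image:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("dinner.jpg", "rb") as f:     # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Scores are 0-1; the listing above shows them as percentages.
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")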

Microsoft
created on 2018-03-16

Face analysis

AWS Rekognition

Age 12-22
Gender Female, 54.8%
Confused 45.5%
Sad 47.3%
Calm 50.6%
Happy 45.4%
Surprised 45.3%
Angry 45.6%
Disgusted 45.3%

AWS Rekognition

Age 19-36
Gender Female, 50.9%
Sad 53.3%
Disgusted 45.1%
Angry 45.2%
Happy 45.6%
Surprised 45.1%
Calm 45.2%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Female, 54.8%
Happy 45.3%
Surprised 45.1%
Disgusted 45.1%
Confused 45.2%
Angry 45.2%
Calm 45.6%
Sad 53.7%
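
The three AWS Rekognition entries above have the shape of Rekognition face detection with all attributes requested. A minimal sketch, assuming boto3 credentials and a hypothetical local copy of the image:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("dinner.jpg", "rb") as f:     # hypothetical filename
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]              # e.g. {"Low": 12, "High": 22}
        gender = face["Gender"]             # {"Value": ..., "Confidence": ...}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:    # each with Type and Confidence
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")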

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 33
Gender Female
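
The Microsoft Cognitive Services entries report only age and gender, consistent with the Face API detect call as it worked when this data was generated. A hedged sketch against the v1.0 REST endpoint; the endpoint host, subscription key, and filename are placeholders:

    import requests

    FACE_ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # assumption
    FACE_KEY = "YOUR_SUBSCRIPTION_KEY"                                 # assumption

    with open("dinner.jpg", "rb") as f:
        resp = requests.post(
            f"{FACE_ENDPOINT}/face/v1.0/detect",
            params={"returnFaceAttributes": "age,gender"},
            headers={
                "Ocp-Apim-Subscription-Key": FACE_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()

    # One entry per detected face, e.g. Age 27 / Gender Female as above.
    for face in resp.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].title()}")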

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
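
Google Vision reports face attributes as likelihood buckets rather than numeric confidences, which is what the rows above show. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical local copy of the image:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("dinner.jpg", "rb") as f:     # hypothetical filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods range from VERY_UNLIKELY to VERY_LIKELY.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)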

Feature analysis

Amazon

Person 99.6%
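
The feature-analysis percentage plausibly corresponds to the per-instance confidence that Rekognition label detection attaches to objects it can localize, as opposed to the image-level tags listed earlier. A short sketch under the same hypothetical boto3 setup as above:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("dinner.jpg", "rb") as f:     # hypothetical filename
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    # Instances carry their own confidence, e.g. "Person 99.6%".
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            print(f"{label['Name']} {instance['Confidence']:.1f}%")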