Human Generated Data

Title

Visitors at Anand Bhavan, Allahabad

Date

2000

People

Artist: Dayanita Singh, born 1961

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Purchase through the generosity of Jose M. Soriano, 2006.183

Copyright

© Dayanita Singh. Courtesy the artist and Frith Street Gallery, London

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Furniture 99.8
Human 99.7
Person 99.7
Person 98.8
Indoors 96.9
Person 96.6
Person 96.6
Bookcase 96.3
Shelf 93.8
Room 90.9
Clothing 84.6
Apparel 84.6
Shop 76
Book 62.6
Bed 61.2
Library 59.9
Person 53.3
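
Each machine-generated tag above pairs a label with a confidence score on a 0–100 scale. A minimal sketch of filtering such a list down to high-confidence labels, assuming plain (label, score) pairs rather than the actual Rekognition API response shape (the names `TAGS` and `high_confidence` are illustrative, not part of any vendor API):

```python
# Hypothetical (label, confidence) pairs mirroring a few of the Amazon tags above.
TAGS = [
    ("Furniture", 99.8), ("Human", 99.7), ("Person", 99.7),
    ("Bookcase", 96.3), ("Shelf", 93.8), ("Shop", 76.0),
    ("Book", 62.6), ("Library", 59.9),
]

def high_confidence(tags, threshold=90.0):
    """Keep labels at or above the confidence threshold, highest score first."""
    return sorted(
        [(label, score) for label, score in tags if score >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )
```

With the default 90.0 threshold, only the first five tags above survive; lowering the threshold admits weaker guesses such as "Shop" and "Library".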

Clarifai
created on 2018-04-19

people 99.6
group 98.8
adult 97.6
woman 96.6
man 94.2
many 92.2
education 91.9
wear 91.7
group together 90.6
several 89.7
administration 85.3
school 83.9
two 83.9
portrait 83.2
music 82
leader 81.8
book series 81.6
five 80.9
monochrome 80.7
room 79.8

Imagga
created on 2018-04-19

turnstile 26.6
gate 22.8
people 19.5
adult 18.8
groom 17.2
movable barrier 16.6
fashion 16.6
man 16.1
shop 16.1
male 15.7
dress 15.4
building 15.3
person 15
architecture 14.3
portrait 13.6
room 13.5
business 13.4
interior 13.3
bride 12.5
old 11.8
barrier 11.6
office 10.9
happy 10.6
couple 10.4
wedding 10.1
passenger 10
smile 10
center 9.6
kin 9.5
love 9.5
happiness 9.4
model 9.3
city 9.1
mercantile establishment 9.1
attractive 9.1
call 9
one 9
window 8.9
posing 8.9
urban 8.7
entrance 8.7
books 8.7
glass 8.6
door 8.5
pretty 8.4
device 8.3
bookshop 8.3
library 8
looking 8
indoors 7.9
cute 7.9
face 7.8
modern 7.7
book 7.7
clothing 7.7
industry 7.7
musical instrument 7.7
elegance 7.6
traditional 7.5
phone 7.4
inside 7.4
indoor 7.3
new 7.3
smiling 7.2
color 7.2
home 7.2
transportation 7.2
hair 7.1
romantic 7.1

Google
created on 2018-04-19

Microsoft
created on 2018-04-19

book 97.3
indoor 96
shelf 94.2
room 91.2
scene 90.8
library 89.5
person 86
store 39.5
shop 8.1

Face analysis

AWS Rekognition

Age 35-52
Gender Female, 93%
Sad 86.4%
Happy 0.4%
Calm 4.7%
Surprised 0.9%
Angry 2.2%
Disgusted 3.5%
Confused 1.9%

AWS Rekognition

Age 26-43
Gender Male, 88.5%
Surprised 2.9%
Happy 1.5%
Calm 75%
Disgusted 7.7%
Sad 4.2%
Confused 5.2%
Angry 3.5%

AWS Rekognition

Age 26-43
Gender Female, 90.8%
Calm 0.4%
Sad 44.4%
Confused 0.7%
Disgusted 0.2%
Happy 53.2%
Surprised 0.7%
Angry 0.5%

AWS Rekognition

Age 26-43
Gender Female, 50.1%
Calm 49.2%
Happy 45.1%
Disgusted 45.2%
Sad 49.2%
Surprised 45.3%
Angry 45.4%
Confused 45.4%

AWS Rekognition

Age 35-52
Gender Male, 85.6%
Happy 11.6%
Angry 7.7%
Calm 37.3%
Surprised 7.7%
Sad 13.2%
Confused 8.6%
Disgusted 13.9%

AWS Rekognition

Age 14-25
Gender Female, 53%
Happy 0.6%
Calm 0.3%
Confused 0.3%
Surprised 0.2%
Angry 1.2%
Disgusted 0.8%
Sad 96.5%

AWS Rekognition

Age 26-43
Gender Male, 50.5%
Happy 49.5%
Angry 49.6%
Calm 49.6%
Disgusted 49.7%
Sad 49.7%
Surprised 49.6%
Confused 49.7%
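
Each AWS Rekognition face record above assigns a score to every emotion; note that some records are near-uniform (every emotion close to 49–50%), which signals low model confidence. A minimal sketch of reducing such a record to its dominant emotion, assuming a plain emotion-to-score dict rather than the actual API response shape (`FACE` and `dominant_emotion` are illustrative names):

```python
# Hypothetical emotion scores mirroring the first face record above.
FACE = {
    "Sad": 86.4, "Happy": 0.4, "Calm": 4.7, "Surprised": 0.9,
    "Angry": 2.2, "Disgusted": 3.5, "Confused": 1.9,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])
```

For the first record this yields a clear "Sad" reading; for the near-uniform records the argmax is essentially arbitrary and should be treated with caution.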

Microsoft Cognitive Services

Age 43
Gender Male

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 41
Gender Female

Microsoft Cognitive Services

Age 40
Gender Female

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people standing in front of a book shelf 69.3%
a group of people standing next to a book shelf 69.2%
a group of people in front of a book shelf 65.8%