Human Generated Data

Title

The Landscape

Date

1988

People

Artist: Tina Barney, American, born 1945

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Gabriella DeFerrari in honor of Henri Zerner, P1995.12

Machine Generated Data

Tags (label followed by model confidence, in %)

Amazon
created on 2019-07-31

Person 99.9
Human 99.9
Person 99.7
Person 99.6
Person 99.6
Furniture 90.7
Couch 90.7
Wood 90.1
Flooring 90.1
Apparel 83.5
Clothing 83.5
Sitting 77.5
Hardwood 73.5
Leisure Activities 72.3
Plywood 67.9
Handrail 61.8
Banister 61.8
Shelf 55.7
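
The Amazon tags above are typical output from AWS Rekognition's DetectLabels operation. A minimal boto3 sketch that would produce a list in this shape; the file name is a placeholder, and configured AWS credentials are assumed:

```python
# Sketch: label detection with AWS Rekognition via boto3.
# "landscape.jpg" is a placeholder path.
import boto3

client = boto3.client("rekognition")

with open("landscape.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=50)

# Each label carries a name and a confidence score in percent,
# matching the "Person 99.9" style of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```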

Clarifai
created on 2019-07-31

man 99.7
group 99.6
four 99.3
woman 99.3
adult 99.2
people 99
room 98.2
facial expression 98.1
group together 97.5
medical practitioner 97.4
three 97.1
family 96.8
recreation 96.7
two 96.2
togetherness 95.5
wear 95.4
indoors 94.9
furniture 94.4
enjoyment 92.5
five 92.3
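
Clarifai's general recognition model returns concepts with 0-1 scores. A rough sketch of the v2 REST call roughly as it worked in 2019, when these tags were created; the API key, image URL, and model ID are placeholders from that era and may have changed since:

```python
# Sketch: Clarifai's "general" visual recognition model via its v2 REST API
# (endpoint shape as of roughly 2019; key and URL are placeholders).
import requests

CLARIFAI_KEY = "YOUR_API_KEY"
GENERAL_MODEL = "aaa03c23b3724a16a56b629203edc62c"  # classic general-model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/landscape.jpg"}}}]},
)

# Concepts come back with a 0-1 "value"; scaling by 100 gives the
# percentages shown above (man 99.7, group 99.6, ...).
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```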

Imagga
created on 2019-07-31

patient 55.7
nurse 41.7
man 39.7
male 37.7
hospital 35.8
person 35.3
people 32.4
medical 30.9
senior 30
adult 29.8
home 28.8
grandfather 27.4
doctor 26.3
couple 25.3
practitioner 25.1
professional 24.7
care 23.1
indoors 22.9
elderly 22
health 21.6
happy 21.3
room 20.5
men 19.8
sick person 19.6
smiling 19.6
case 19.2
clinic 19.2
family 17.8
mature 17.7
medicine 17.6
sitting 17.2
illness 17.2
husband 16.2
occupation 15.6
bed 15.2
specialist 15
together 14.9
father 14.5
love 14.2
cheerful 13.8
child 13.7
talking 13.3
lifestyle 13
office 12.9
women 12.7
retired 12.6
hairdresser 12.6
work 12.6
happiness 12.6
uniform 12.4
wife 12.3
treatment 12
indoor 11.9
mother 11.9
stethoscope 11.5
smile 11.4
help 11.2
old 11.2
kin 11.1
worker 11
two people 10.7
face 10.7
kid 10.6
retirement 10.6
loving 10.5
boy 10.4
coat 10.4
portrait 10.4
relationship 10.3
surgery 9.8
interior 9.7
working 9.7
sick 9.7
30s 9.6
exam 9.6
lab coat 9.4
two 9.3
inside 9.2
dad 8.9
grandma 8.9
table 8.7
teacher 8.6
married 8.6
casual 8.5
holding 8.3
fun 8.2
children 8.2
group 8.1
to 8
job 8
70s 7.9
examining 7.9
son 7.8
examination 7.8
affectionate 7.8
couch 7.7
chair 7.7
pain 7.7
daughter 7.6
student 7.6
life 7.6
parent 7.3
book 7.3
aged 7.2
looking 7.2
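
Imagga exposes auto-tagging through its /v2/tags endpoint with HTTP Basic authentication. A sketch using requests; the key, secret, and image URL are placeholders:

```python
# Sketch: Imagga's auto-tagging endpoint (v2 REST API).
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/landscape.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP Basic auth
)

# Tags arrive with confidences already on a 0-100 scale,
# e.g. "patient 55.7" as in the list above.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```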

Google
created on 2019-07-31

Sitting 54.1
Grandparent 53.7
Family 53.7
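
Google's label scores come from the Cloud Vision label-detection feature. A short sketch with the google-cloud-vision client; the file path is a placeholder and application credentials are assumed to be configured:

```python
# Sketch: label detection with the google-cloud-vision client library.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("landscape.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1 floats; multiplying by 100 matches the values above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```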

Microsoft
created on 2019-07-31

person 100
wall 97.6
indoor 96.9
man 94.5
dog 87
smile 84.1
clothing 79.9
standing 77.7
human face 75.1
carnivore 70.1
baby 51.8
animal 50
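
Microsoft's tags correspond to the Computer Vision "analyze" endpoint with the Tags visual feature. A sketch against the v3.2 REST API; the resource endpoint, key, and image URL are placeholders:

```python
# Sketch: tag detection with the Azure Computer Vision "analyze" REST endpoint.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/landscape.jpg"},
)

# Tag confidences are 0-1; scaled to percent they line up with the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```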

Color Analysis

Face analysis

AWS Rekognition

Age 4-9
Gender Female, 91.7%
Disgusted 1.6%
Calm 11.1%
Sad 69.9%
Angry 6.4%
Happy 1.5%
Confused 3.5%
Surprised 6%

AWS Rekognition

Age 35-52
Gender Male, 98.8%
Surprised 4%
Angry 9.1%
Calm 45.4%
Confused 10.2%
Happy 12.7%
Sad 7.8%
Disgusted 10.8%

AWS Rekognition

Age 38-59
Gender Male, 98.6%
Sad 3.3%
Calm 53.3%
Surprised 2.9%
Disgusted 31.7%
Happy 1.8%
Confused 3.8%
Angry 3.1%

AWS Rekognition

Age 26-43
Gender Female, 99.9%
Surprised 4.3%
Confused 5.4%
Calm 54.7%
Happy 13%
Angry 12%
Disgusted 3.6%
Sad 7%
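
The age ranges, gender calls, and per-emotion percentages above are the FaceDetails structure returned by Rekognition's DetectFaces operation when all facial attributes are requested. A minimal boto3 sketch; the file path is a placeholder:

```python
# Sketch: per-face age, gender, and emotion estimates from AWS Rekognition.
import boto3

client = boto3.client("rekognition")

with open("landscape.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]              # e.g. {"Low": 4, "High": 9}
    gender = face["Gender"]             # {"Value": ..., "Confidence": ...}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:    # one confidence per emotion type
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```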

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 35
Gender Male

Microsoft Cognitive Services

Age 32
Gender Female
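
The single age and gender estimates come from the Azure Face API's detect call as it worked when these results were generated; note that Microsoft has since retired the age and gender attributes. A historical sketch; the endpoint, key, and image URL are placeholders:

```python
# Sketch: the Azure Face API "detect" call, as of the era these estimates
# were produced (the age/gender attributes are no longer available).
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/landscape.jpg"},
)

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')
```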

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
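
Cloud Vision reports face attributes as categorical likelihood buckets rather than numeric scores, which is why these blocks read "Very unlikely" or "Unlikely". A sketch of the face-detection call; the file path is a placeholder:

```python
# Sketch: per-face likelihood buckets from google-cloud-vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("landscape.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods range from VERY_UNLIKELY to VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```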

Feature analysis

Amazon

Person 99.9%

Categories

Imagga

people portraits 99.8%

Captions

Azure OpenAI

created on 2024-11-13

The image features four individuals in an indoor setting that appears to be a living room or a similar space. On the left, there is a person wearing a pink button-up shirt and khaki pants, holding and looking down at a magazine or book. In the center background, there is a framed piece of artwork hanging on the wall, depicting a seascape scene with figures in it. To the right, another individual is dressed in a white polo shirt and light-colored pants, holding a small, brown and white dog. The dog appears to be calm and alert, looking toward the right side of the frame. There are additional details in the room, including a spinning wheel, a variety of decorative items on a sideboard, and glimpses of furniture that suggest a comfortable, domestic environment. In the foreground, the top of another person's head is visible, showing curly light-colored hair. The overall attire of the individuals and the decor hint at a casual, possibly familial gathering.
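
A caption like this can be produced by sending the image to a vision-capable Azure OpenAI chat deployment. A sketch with the openai Python SDK; the endpoint, key, API version, deployment name, and image URL are all placeholders:

```python
# Sketch: image captioning with an Azure OpenAI vision-capable chat deployment.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR_RESOURCE.openai.azure.com",
    api_key="YOUR_KEY",
    api_version="2024-06-01",
)

response = client.chat.completions.create(
    model="YOUR_DEPLOYMENT",  # e.g. a gpt-4o deployment
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image in detail."},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/landscape.jpg"}},
        ],
    }],
)

print(response.choices[0].message.content)
```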

Anthropic Claude

created on 2024-11-13

The image depicts a family scene in what appears to be a living room or home. There are three people visible - two adult men and one young girl. One of the men is sitting in a chair reading a book, while the other man is standing and holding a small dog. The young girl is seated in the foreground, looking down. Behind them, there is a framed painting on the wall. The overall atmosphere seems relaxed and domestic.
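
The Claude caption corresponds to an Anthropic Messages API call with the image attached as base64 content. A sketch with the anthropic SDK; the model name and file path are placeholders, and ANTHROPIC_API_KEY is assumed to be set:

```python
# Sketch: image description with the Anthropic Messages API.
import base64
import anthropic

client = anthropic.Anthropic()

with open("landscape.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # any vision-capable Claude model
    max_tokens=300,
    messages=[{
        "role": "user",
        "content": [
            {"type": "image",
             "source": {"type": "base64", "media_type": "image/jpeg",
                        "data": image_b64}},
            {"type": "text", "text": "Describe this image."},
        ],
    }],
)

print(message.content[0].text)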

Text analysis

Google

Tuis ama 42Yb.1488
Tuis
ama
42Yb.1488
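
The lines above follow Cloud Vision's OCR output layout: the first annotation is the full detected string and the rest are the individual tokens. A sketch of the text-detection call; the file path is a placeholder:

```python
# Sketch: text (OCR) detection with google-cloud-vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("landscape.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected string; the rest are the
# individual words, matching the layout of the list above.
for annotation in response.text_annotations:
    print(annotation.description)
```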