Human Generated Data

Title

K. Family (Mason / Handyman, Nurse / Housewife), Berlin-Pankow

Date

1984

People

Artist: Christian Borchert, German, 1942–2000

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Antonia Paepcke DuBrul Fund, 2018.216

Copyright

© Estate of Christian Borchert

Machine Generated Data

Tags

Amazon
created on 2019-07-20

Human 99.7
Person 99.7
Person 99.7
Person 99.6
Person 99.6
Person 99.1
Person 99
Person 99
Person 99
Person 98.1
Furniture 96.9
Chair 96.9
Person 94.1
School 94
Room 93.4
Indoors 93.4
Classroom 93
Person 92.9
Chair 91.6
Person 90.5
Apparel 86.7
Clothing 86.7
Table 75.9
People 67.9
Kindergarten 65.9
Face 62.8
Kid 60.9
Child 60.9
Workshop 58.2
Chair 55.8
Person 52.2
Person 46
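
The confidence-scored labels above are the kind of output returned by Amazon Rekognition's label-detection endpoint. As a rough, hypothetical sketch only (the S3 bucket, object key, and thresholds below are placeholders, not details from this record), such a list could be produced like this:

import boto3

# Hypothetical sketch: retrieve confidence-scored labels like those listed
# above from Amazon Rekognition. Bucket and key are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "borchert-k-family.jpg"}},
    MaxLabels=50,
    MinConfidence=45,  # the list above runs down to roughly 46% confidence
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')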

Clarifai
created on 2019-07-20

group 99.8
people 99.7
adult 98.2
group together 97.9
child 97.8
woman 96.6
facial expression 96.4
education 96.1
many 94.5
man 94.2
elementary school 92.3
four 91.9
furniture 91.7
school 91.4
five 91.2
several 90.4
boy 90.3
room 88.6
teacher 87.9
offspring 87.6

Imagga
created on 2019-07-20

classroom 100
room 100
people 38.5
man 32.9
male 29.2
person 27.5
adult 26.4
group 25
home 23.9
smiling 23.9
teacher 23.7
indoors 22
happy 21.3
child 21
men 20.6
women 20.6
together 20.2
businesswoman 20
education 19.9
business 19.4
businessman 19.4
kin 19.3
table 18.2
team 17.9
meeting 17.9
family 17.8
office 17.7
student 17.3
couple 16.6
mother 16.5
laptop 16.4
cheerful 16.3
desk 16.1
computer 16
smile 15.7
20s 15.6
businesspeople 15.2
school 14.7
sitting 14.6
portrait 14.2
professional 14.2
lifestyle 13.7
children 13.7
color 13.4
working 13.3
boy 13
mature 13
teamwork 13
discussion 12.7
work 12.6
happiness 12.5
camera 12
students 11.7
colleagues 11.7
class 11.6
interior 11.5
adults 11.4
friends 11.3
learning 11.3
senior 11.2
corporate 11.2
executive 11.1
book 11
indoor 11
holding 10.7
40s 10.7
books 10.6
to 10.6
30s 10.6
college 10.4
love 10.3
girls 10
father 9.9
reading 9.5
talking 9.5
study 9.3
casual 9.3
teen 9.2
nurse 9
discussing 8.8
daughter 8.8
looking 8.8
standing 8.7
mid adult 8.7
husband 8.6
workplace 8.6
females 8.5
occupation 8.3
successful 8.2
clothing 8.1
childhood 8.1
success 8
worker 8
son 7.9
explaining 7.9
coworkers 7.9
cooperation 7.7
studying 7.7
elderly 7.7
educator 7.6
friendship 7.5
manager 7.5
new 7.3
parent 7.1

Google
created on 2019-07-20

Social group 91.1
Room 78
Event 69.4
Class 67.1
Family 63
Black-and-white 56.4
Monochrome 54.4
House 53.7

Microsoft
created on 2019-07-20

person 99.6
clothing 97.9
family 93.8
human face 92.5
smile 90.8
table 71.7
group 68.7
people 67.8
woman 64.2
furniture 64
school 62.3
posing 41.7
restaurant 29

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 100%
Disgusted 6.7%
Sad 5.9%
Happy 15.6%
Confused 10.1%
Angry 11.3%
Surprised 2.8%
Calm 47.6%

AWS Rekognition

Age 4-9
Gender Female, 97%
Happy 2.9%
Sad 71.5%
Disgusted 1.1%
Confused 2.3%
Calm 5.3%
Surprised 1.8%
Angry 15.2%

AWS Rekognition

Age 4-9
Gender Male, 99.3%
Angry 1.6%
Disgusted 0.3%
Happy 0.3%
Sad 3.2%
Calm 90.6%
Surprised 0.4%
Confused 3.7%

AWS Rekognition

Age 9-14
Gender Male, 98.4%
Disgusted 0.1%
Surprised 0.4%
Angry 0.3%
Calm 97.7%
Confused 0.6%
Happy 0.3%
Sad 0.6%

AWS Rekognition

Age 4-9
Gender Female, 76%
Calm 3.9%
Sad 85.7%
Angry 6.3%
Surprised 0.8%
Happy 1.7%
Confused 1%
Disgusted 0.6%

AWS Rekognition

Age 4-7
Gender Female, 96%
Calm 88.6%
Sad 3.2%
Angry 2%
Surprised 2.5%
Happy 0.8%
Confused 2.3%
Disgusted 0.6%

AWS Rekognition

Age 12-22
Gender Male, 99.6%
Disgusted 2.8%
Sad 4.3%
Confused 4.1%
Surprised 1.2%
Happy 65.9%
Angry 3.4%
Calm 18.4%

AWS Rekognition

Age 15-25
Gender Male, 96.6%
Disgusted 2.7%
Surprised 2.7%
Calm 72.2%
Sad 5.7%
Angry 8.3%
Happy 2.5%
Confused 5.9%

AWS Rekognition

Age 6-13
Gender Female, 64.4%
Calm 93.5%
Sad 4.7%
Angry 0.4%
Surprised 0.2%
Happy 0.2%
Confused 0.7%
Disgusted 0.2%

AWS Rekognition

Age 20-38
Gender Female, 57.5%
Disgusted 2.9%
Sad 2.5%
Happy 1.7%
Confused 2.4%
Angry 2.4%
Surprised 0.8%
Calm 87.3%

AWS Rekognition

Age 11-18
Gender Male, 94.9%
Disgusted 5.8%
Surprised 1.7%
Angry 13.5%
Calm 58.3%
Confused 7.1%
Happy 1.7%
Sad 11.9%

AWS Rekognition

Age 35-52
Gender Female, 96.5%
Happy 94.9%
Calm 1.5%
Confused 0.7%
Angry 0.6%
Sad 0.6%
Disgusted 1.4%
Surprised 0.4%
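
The per-face age ranges, gender estimates, and emotion percentages above are the kind of attributes reported by Rekognition's face-detection endpoint. A minimal, hypothetical sketch (again with a placeholder S3 location, not the museum's actual storage):

import boto3

# Hypothetical sketch of how age range, gender, and emotion scores like
# those listed above are obtained from Rekognition face detection.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "borchert-k-family.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')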

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 25
Gender Male

Microsoft Cognitive Services

Age 27
Gender Male

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 4
Gender Male

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 12
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 1
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Chair 96.9%

Categories

Imagga

people portraits 100%

Captions

Azure OpenAI

Created on 2024-11-13

The image is a black and white photograph that appears to be taken in the interior of a home, possibly dating back a few decades, given the style of clothing and decor. The space is furnished with wooden cabinets, a table with a striped tablecloth, chairs, and decorative items like pennants, dishes in a glass-fronted cabinet, and wall decorations that include posters or pictures. The room is carpeted, and on the walls, there is patterned wallpaper. Seven individuals are visible standing together as if posing for a family portrait. They are dressed in an assortment of clothing that includes patterned and striped tops, dresses, and skirts, suggesting a variety of ages among them. The attire looks quite traditional, further hinting at the photo being from an earlier era. A bowl or dish rests on the table in the forefront, and overall, the scene resembles a mid-century domestic setting.

Anthropic Claude

Created on 2024-11-13

The image appears to be a family portrait. It shows a group of people, mostly adults and children, standing and sitting in what looks like a school or community center setting. The individuals are dressed in clothing typical of the mid-20th century, with the adults wearing casual outfits like plaid shirts and the children wearing sweaters and other casual attire. The room has wooden lockers or cabinets along the walls, and there are some decorative items like posters or artwork hanging on the walls. The people in the image seem to be a family or group, possibly posed for a formal portrait or photograph.

Meta Llama

Created on 2024-11-21

The image is a black and white photograph of a family of 11 standing in a kitchen. The family consists of two adults, a man and a woman, and nine children, ranging in age from infant to teenager. The man is holding a baby in his arms, while the woman stands beside him. The children are dressed in various outfits, including dresses, sweaters, and button-down shirts. In the foreground, there is a table with chairs and a bowl on it. The background of the image features a kitchen with wooden cabinets and a wall covered in wallpaper. There are also some decorative items on the wall, such as a clock and a picture frame. Overall, the image appears to be a family portrait taken in the 1960s or 1970s, based on the clothing and decor of the time period.

Text analysis

Amazon

C

Google

C C wndor TJ
C
wndor
TJ